
Questions Arising about Emergence, Data Collection, and Its Interaction with Analysis in a Grounded Theory Study

Catherine D. Bruce

Catherine Bruce, PhD, Assistant Professor, Trent University, School of Education and Professional Learning, Canada

Abstract: There has been a strong call for increased clarity and transparency of method in qualitative research. Although qualitative data analysis has been detailed, data management has not been made as transparent in the literature. How do data collection and analysis interact in practical terms? What constitutes sufficient data? And can research be both planful and emergent? In this paper, the author highlights several methodological strategies for addressing data management challenges in a grounded theory study of preservice mathematics teachers.

Keywords: grounded theory, data management, methodology, efficacy

Citation: Bruce, C. D. (2007). Questions arising about emergence, data collection, and its interaction with analysis in a grounded theory study. International Journal of Qualitative Methods, 6(1), Article 4. Retrieved [date] from http://www.ualberta.ca/~iiqm/backissues/6_1/bruce.pdf

The call for transparency and clarity of qualitative research methods (Anfara, Brown, & Mangione, 2002; Demerath, 2006) has become more focused and urgent as educational researchers attempt to address complex issues of "improvement" and the wider education community makes demands for assurances of credibility (that the study has internal reliability), transferability (that rich descriptions of data collection and analysis methods allow for potential application in other contexts), and dependability (that a clear audit trail is articulated for inspection by the reader). Clarity and transparency of methods require detailed documentation and illustration for these demands to be met. Demerath has described this response as the elucidative response for reducing marginalization of qualitative research.

It is well understood that there is a need for a wide range of research methods, including both quantitative and qualitative studies, to capture trends and effects as well as provide descriptions that explain how and why outcomes are achieved and phenomena are experienced. Yet there continue to be broad assumptions about how qualitative and quantitative data are gathered and analyzed. These assumptions must be reexamined regularly. Furthermore, authors of reports must ensure that methods are not oversimplified or underdescribed. In this article, I examine particular complexities of data collection, management, and related analysis based on a grounded theory study in mathematics education where resources were limited. Methodological issues arising in the study are magnified and scrutinized to illustrate specific strategies for data management.

Questioning assumptions about qualitative and quantitative methods

A typical description of a quantitative study suggests that the method used is deductive: The conclusions follow necessarily from the premises. Researchers are expected first to develop a hypothesis, use prior theory, and anticipate conclusions; then to collect data appropriate to the anticipated conclusions; and, finally, to analyze the data numerically. This involves using a preplanned, fixed structure (Creswell, 2005). On closer examination, it becomes apparent that quantitative studies often involve careful examination of data (e.g., to determine how key variables are distributed and correlated) before researchers launch into a main analysis. This frequently includes the reworking of research questions and subsequent literature review. Essentially, the frame of the study can change in response to the data.

A typical description of a qualitative study suggests that the method used is inductive: reasoning from the specific to a whole and focusing on the particulars rather than the general. Qualitative researchers are expected to gather rich descriptive data and ground conclusions and understandings in the data mined, not prior theories. It is the particulars that tell the story. This involves using an emerging, flexible structure (Creswell, 2005). On closer examination, however, it becomes apparent that qualitative studies often involve overt planning (e.g., by creating start codes prior to analysis) before the researcher launches into a main analysis. There are significant regularities in data collection and analysis procedures. Qualitative studies also have theoretical expectations that guide the collection and analysis stages, particularly in the light of ethical review requirements that must be met before the onset of any study.

With these complexities in mind, it is possible to imagine how qualitative and quantitative educational research methods are closer than traditionally described. It seems as though the fundamental reasoning of the methods is similar. Perhaps educational researchers weave between induction and deduction in an iterative process. These deliberations are not new to most educational researchers; however, descriptions of methods do not tend to acknowledge or illustrate these complexities.

In a qualitative study of preservice teacher efficacy in mathematics, issues of methodological credibility, transferability, and dependability all surfaced. The study examined how preservice teacher efficacy changed as teacher candidates were engaged in practicum settings at schools and in a reform-based mathematics methods course. Because there are very few qualitative studies of preservice teacher efficacy, and because the study was to be grounded in teacher candidate experiences, constructivist grounded theory (Charmaz, 1994, 2003) was used as the research method.

Grounded theory incorporates a number of quantitative-like procedures, such as data saturation requirements; prescriptive coding systems of open, axial, and selective coding; and code counts. Because grounded theory shares some characteristics with quantitative methods (Creswell, 2005; Greckhamer & Koro-Ljungberg, 2005) but is clearly positioned in the qualitative tradition, it offers an interesting lens through which distinctions between quantitative and qualitative data collection and analysis methods appear less absolute.
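To make the quantitative flavor of code counts concrete, the following minimal Python sketch tallies how often each open code is applied across transcript segments. The codes and coded segments here are hypothetical stand-ins, not the study's data:

```python
from collections import Counter

# Hypothetical open codes assigned to transcript segments during a first
# coding pass; in a real study these come from line-by-line coding.
coded_segments = [
    ["prior_failure", "anxiety"],
    ["practicum_support", "mastery_experience"],
    ["anxiety", "avoidance"],
    ["mastery_experience", "prior_failure"],
]

# A code count is simply a frequency table over all applied codes.
code_counts = Counter(code for segment in coded_segments for code in segment)
for code, count in code_counts.most_common():
    print(f"{code}: {count}")
```

Counts like these do not replace interpretation; they only flag which categories recur often enough to warrant closer qualitative scrutiny.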


An example study in mathematics teacher efficacy

I undertook a study of preservice teacher mathematics efficacy to examine when, how, and why efficacy increased or decreased during a 1-year bachelor of education program that included a 36-hour mathematics methods course and 61 days of practice teaching in schools. Research in the area of teacher efficacy has produced a solid body of literature over the past 25 years that focuses on how teachers judge their capability to bring about student learning (Bandura, 1986, 1997; Gibson & Dembo, 1984; Goddard, Hoy, & Woolfolk Hoy, 2004; Ross, 1998; Tschannen-Moran & Woolfolk Hoy, 2001; Tschannen-Moran, Woolfolk Hoy, & Hoy, 1998). The teacher assesses his or her ability to perform a given task based on analysis of what is required to accomplish the task, reflection on past similar situations, and an assessment of the resources available (Tschannen-Moran, Woolfolk Hoy, & Hoy, 1998). Teachers with high self-efficacy are more likely to experiment with effective but challenging instructional strategies, such as performance-based assessment (Vitali, 1993) and student-directed, activity-based methods (Riggs & Enochs, 1990). Teacher efficacy is important because it is a reliable predictor of student achievement (Mascall, 2003; Muijs & Reynolds, 2001) and of math reform implementation (evidence reviewed in Ross, 1998).

Teaching efficacy is particularly important in elementary mathematics because many of the candidates do not have strong mathematics backgrounds. Most teacher candidates have experienced traditional programs as students and have not observed, or participated in, programs that are based on reformed math principles and practices (Bruce, 2005). The Conference Board of the Mathematical Sciences (1975) found that "Teachers are essentially teaching the same way they were taught in school" (p. 77). More than 20 years later, "the average classroom showed little change" (Hiebert, 1999, p. 11). The same method of teaching persists today despite pressure to change (D'Ambrosio, Boone, & Harkness, 2004; Ross, McDougall, & Hogaboam-Gray, 2002).

Research design and data collection

In the sample study, I examined the teaching efficacy of 50 elementary preservice teachers, but the main goal of the study was to understand qualitatively the experiences of preservice teachers and examine what factors influenced their efficacy ratings and why. For a single researcher, the initial sample size was too large for gathering rich qualitative data. Therefore, the participant sample was narrowed from 50 participants to 10 in a stratified random sample, and then finally to 2 participants as extreme sampling case studies.
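As a rough sketch of how such a funnel might be operationalized, the Python below draws a stratified random sample and then selects extreme cases. The participant records, stratum variable, and efficacy scores are hypothetical stand-ins for illustration, not the study's actual data or procedure:

```python
import random
from collections import defaultdict

def stratified_sample(participants, stratum_key, n_total, seed=1):
    """Draw a stratified random sample, allocating draws to each stratum
    in proportion to its size (a sketch of the 50 -> 10 narrowing step)."""
    strata = defaultdict(list)
    for p in participants:
        strata[stratum_key(p)].append(p)
    rng = random.Random(seed)
    sample = []
    for members in strata.values():
        k = max(1, round(n_total * len(members) / len(participants)))
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample[:n_total]

# Hypothetical records: 'division' is the stratum; 'efficacy' stands in
# for a pre-course efficacy scale score.
participants = [{"id": i,
                 "division": "primary" if i % 2 else "junior",
                 "efficacy": random.Random(i).uniform(1, 9)}
                for i in range(50)]

ten = stratified_sample(participants, lambda p: p["division"], n_total=10)

# Final narrowing (10 -> 2): the lowest- and highest-scoring participants,
# echoing the one negative and one strongly positive extreme case.
by_score = sorted(ten, key=lambda p: p["efficacy"])
extreme_cases = [by_score[0], by_score[-1]]
print([p["id"] for p in extreme_cases])
```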

Four purposes drove the preservice teacher efficacy study. They were (a) to investigate preservice teachers' math stories of experience as sources of tension, "failure," and "success"; (b) to identify methods that contribute to preservice development of efficacy related to mathematics reform based teaching methods; (c) to inform the teaching and learning strategies and theoretical understandings of preservice elementary mathematics methods courses; and (d) to gather information on how efficacy beliefs of preservice teachers changed through experience.

There were two phases in the study: a pilot phase and a full study phase. I used the pilot phase (Year 1) to understand the scope of the situation. This phase supplied rich data demonstrating the value of continuing with a full study (Year 2). During the pilot phase, previous studies and existing theoretical frameworks were carefully reviewed. The pilot phase thus acted as an informing agent where existing research, the preliminary data collection, and analysis interacted to inform the full study.

Several forms of data collection were used, including interviews, written log entries by participants, inventories, observations, and a teacher efficacy survey. In an attempt to demonstrate transparency of data collection methods, I have detailed the data collection methods and timing in Tables 1 and 2 to match the four research purposes. Tables 1 and 2 make the data collection methods transparent to the reader to meet two criteria posited by Lincoln and Guba (1985): transferability and dependability. That is, the collection methods are described in sufficient detail that the reader can inspect the process, and are displayed with enough clarity that a researcher could (a) consider a similar implementation sequence in another research context or (b) use the tables as an example of dependable reporting practices.

Grounded theory dilemmas

Grounded theory analysis procedures have been well documented in the methodology literature (Charmaz, 1990; Creswell, 1998, 2005; Harry, Sturges, & Klinger, 2005), and highlight the validity (and, some would argue, the objectivist underpinnings) of this research method. For a full discussion of the terrain, evolution, and developments of grounded theory, see Greckhamer and Koro-Ljungberg (2005) and Mills, Bonner, and Francis (2006). However, the challenges of data collection and management in grounded theory studies have not been fully identified and addressed.


Table 1. Data collection methods and purposes

Math inventory
Participant involvement: All participants individually completed an eight-question questionnaire on the first day of class.
Research purpose and additional comments: Purpose (a), to investigate preservice teachers' math stories of experience as sources of tension, "failure," and "success." Used for baseline data regarding prior experiences and general level of confidence and interest coming into the mathematics methods course.

Teacher efficacy scale
Participant involvement: All participants individually completed the Teacher's sense of efficacy scale on four occasions.
Research purpose and additional comments: Purpose (d), to gather information on how efficacy beliefs changed through experience for preservice teachers. The efficacy scale was used to plot confidence changes over time; the scale was also used to support final log entry writing by participants and as a point of discussion during one-to-one interviews. Data from interviews gave insights into participant interpretation.

Math logs
Participant involvement: All participants individually completed math log entries on a biweekly basis during the course.
Research purpose and additional comments: Purposes (a), to investigate preservice teachers' math stories of experience as sources of tension, "failure," and "success," and (c), to inform the teaching and learning strategies and theoretical understandings of preservice elementary mathematics methods courses. Participants were given specific math log prompts to write about; the final log prompt is most helpful as it relates to self-efficacy.

Class observations
Participant involvement: All participants in class were observed while working in small groups; field notes were recorded.
Research purpose and additional comments: Purposes (b), to identify methods that contribute to preservice development of efficacy related to mathematics reform based teaching methods, and (c), to inform the teaching and learning strategies and theoretical understandings of preservice elementary mathematics methods courses. Participants were involved in regular class activity; the researcher made observations of students in a field notes journal both during and after each observation occasion.

Focus group interviews
Participant involvement: Participants met in three groups: Group A, Group B, and Group C (18 participants in Year 1; 12 participants in Year 2). Participants drew images of themselves as math teachers at the onset of meetings.
Research purpose and additional comments: Purposes (a), to investigate preservice teachers' math stories of experience as sources of tension, "failure," and "success," and (b), to identify methods that contribute to preservice development of efficacy related to mathematics reform based teaching methods. For 2004, analysis was completed in Spring 2004 and a paper written; for 2005, all transcripts from the focus group interviews were completed following the interviews.

One-to-one interviews
Participant involvement: Selected participants met one-to-one with the researcher for a 75-minute interview at the end of the program (5 interviews in Year 1; 10 interviews in Year 2, 2 of which were extended in length and detail).
Research purpose and additional comments: Purposes (a), to investigate preservice teachers' math stories of experience as sources of tension, "failure," and "success"; (b), to identify methods that contribute to preservice development of efficacy related to mathematics reform based teaching methods; and (c), to inform the teaching and learning strategies and theoretical understandings of the preservice elementary mathematics methods course. Five interviews were conducted in Year 1 and 10 in Year 2; these provided detailed information, but not quite as rich as the focus group interviews.

In this study, three principal data dilemmas emerged for me as a researcher; they are discussed in detail in this paper.

How can I begin to collect data in an inductive manner? If there is no theoretical framework or "hunch" to begin with, how would I know what to collect? There are criticisms that grounded theory does not incorporate reference to, and use of, existing theoretical frameworks. However, according to Berg (2001), Charmaz (2003), and Strauss (1987), grounded theory is not an entirely inductive process. Interplay between experience, induction, and deduction is required. This presents a fundamental methodological dilemma: Is it possible for research to be both planful and emergent?

What is the nature of the interaction between data collection and analysis? The iterative process of collection and analysis is alluded to in most grounded theory studies because the analysis of specific data should inform the following round of data collection, but how is this actually done? Furthermore, how does the researcher describe this phenomenon in reports?

How much data collection is required in a grounded theory study? In a grounded theory study, the volume of data can quickly become overwhelming (see Harry et al., 2005, for example) because of the inductive nature of the study. The importance of saturation of codes is what sometimes drives overcollection of data, because, although the study is qualitative in nature, the researcher wants to be confident that the themes identified are exhaustive and/or valid. This leads to the danger of gathering wide, but shallow, pools of data. The objective, however, is to gather rich data that do not deplete or overconsume valuable research resources.

These three methodological dilemmas or tensions are deeply interconnected. They require further elaboration and discussion, both in relation to the sample study and in the qualitative research literature overall.

Can grounded theory studies be both planful and emergent?

A key characteristic of grounded theory is the use of an emergent design.


Table 2. Timing of data collection

Math inventory: September (first class in September, Years 1 and 2). Single collection episode.

Teacher efficacy scale: September, October, February, and March (early September, mid October, and mid February, Year 2). Punctuated collection episodes across 4 months.

Class observations: September through December (mid September to mid December, Year 2). Ongoing collection; 8 occasions of 2 hours each.

Math logs: September through March (mid September to end of March, Years 1 and 2). Ongoing collection; biweekly.

Focus group interviews: November (early November, Years 1 and 2). Single collection episode for each of the 3 focus groups.

One-to-one interviews: May (Years 1 and 2). Single collection episode for each participant.

Creswell (2005) defined emergent design as a process whereby "the researcher collects data, analyzes it immediately rather than waiting until all data are collected, and then bases the decision about what data to collect next on this analysis" (p. 405). The issue of practicality surfaces quickly. To collect data, and then analyze it before continuing with further collection, the overall research period must be extensive. The analysis period takes up time that might otherwise be used for additional data collection. In the sample study, the conditions were optimal, with distinct "on" and "off" data collection and analysis periods: Periods of methods course activity were followed by periods of classroom placement activity, allowing for pauses in data collection and time for data analysis. Whether time periods are restricted or vast, research studies cannot theoretically or practically be launched without hunches or a framework. The framework might not be explicit, but the researcher begins, at a minimum, with objectives based on prior experiences. This reality clearly challenges any pure induction claims. Issues of emergent design were confronted at most stages of data collection and analysis in the preservice teacher efficacy study: As a "responsible" researcher, I had theoretical understandings and data collection methods clearly detailed from the outset of the study, which was in direct competition with my goal of responding to the analysis of each data collection episode by refining the subsequent collection methods and/or episodes.

The educational researcher cannot begin to collect data for publication purposes without first undergoing ethical review. The ethical review process for the preservice teacher efficacy study was stringent, with more than 30 pages of documentation and five ethical research committee deliberation meetings. Letters to participants, interview outlines, surveys, and scales were all under close scrutiny because of my strong participatory role. As is typical of many ethical reviews, a summary of the related literature was required. Timelines and dissemination plans were also demanded. Given all of the requirements of ethical reviews for education research studies, the ability to use a pure emergent design is considerably compromised. Qualitative studies, however, value an emergent unfolding of the research. Can the researcher plan for emergence?

In allowing for emergence, the most effective strategy employed in this efficacy study was the use of an extended timeline allowing for two phases: a pilot year and a full-year study. The pilot phase of data collection functioned as both a method informant and a findings informant. From the data collected in the pilot phase, both the methods for data collection and the focus of collection were refined. This strategy supports a core principle of constructivist grounded theory; that is, there is a full acknowledgement that I as researcher was interacting with participants, the data, and the literature as the study was co-constructed.

The simple event of one-to-one interviews is used here to illustrate how data collection and analysis informed the subsequent collection activity in an emergent design. Interviews with preservice teachers were held at the end of the program. The questions, although drafted, were not finalized until observation and focus group data were collected and analyzed. Originally, Question 3 was What elements of your classroom placement(s) were helpful in building confidence in your math teaching? This question was revised on analysis of data in the pilot phase, which indicated mixed experiences on classroom placements, leading to increased efficacy for some participants but decreased efficacy for others. The original question was expanded to two questions for the final interview, allowing for a range of responses: (a) What features of the program over this past year have helped you build confidence teaching math? and (b) What features of the program have hindered your confidence teaching mathematics?
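For readers who think procedurally, the following Python sketch caricatures this collect-analyze-refine step. The analyze function and the pilot responses are hypothetical stand-ins for the pilot-phase coding that actually drove the revision:

```python
def analyze(responses):
    """Stand-in for pilot-phase analysis: flag 'mixed experiences', i.e.,
    placements that helped some participants and hindered others."""
    return {"mixed": any("helped" in r for r in responses)
                     and any("hindered" in r for r in responses)}

# Draft instrument and hypothetical pilot responses.
questions = ["What elements of your classroom placement(s) were helpful "
             "in building confidence in your math teaching?"]
pilot_responses = ["the placement helped my confidence",
                   "the placement hindered my confidence"]

# Emergent design in miniature: the analysis of one collection episode
# reshapes the instrument used in the next episode.
if analyze(pilot_responses)["mixed"]:
    questions = ["What features of the program over this past year have "
                 "helped you build confidence teaching math?",
                 "What features of the program have hindered your "
                 "confidence teaching mathematics?"]
print(questions)
```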

A broader example of data collection emergence relates to the timing of data collection. Originally, the individual interviews were to be conducted at the end of the methods course (February). Once I clearly understood the impact of classroom placements, based on analysis of focus group interviews, however, the timing of interviews was moved to April, when participants would have completed an additional placement. This proved to be fruitful for gathering a more encompassing story of preservice teacher experiences.

A final example of planning for emergence illustrates how theoretical frameworks and related research instruments influenced the refinement of collection strategies. In the pilot year, a simple survey was administered three times. The strategy yielded high-quality results, but the instrument was flawed, in that it did not account for many areas of efficacy information. In the full study, the original survey was replaced with one developed and tested by other researchers. It was not until the literature review was extended to examine teacher efficacy studies based on quantitative data that the value of the Teacher's Sense of Efficacy Scale was considered. The 12-item instrument (Tschannen-Moran & Woolfolk Hoy, 2001) was chosen because of its demonstrated reliability and validity. The qualitative examination of the efficacy scale led to further understanding of exactly how items on the scale were being interpreted by participants and what factors influenced depression or elevation of scores.


How do data collection and analysis interact?

Some data analysis procedures have been well documented in the methodology literature (Charmaz, 1990; Creswell, 1998, 2005; Denzin & Lincoln, 2003; Harry et al., 2005; Seale, 1999), particularly in terms of coding processes, to illustrate the validity of grounded theory as a research method and the credibility of findings. Charmaz (2003) clearly described the theoretical understanding of data collection and analysis interaction:

Essentially, grounded theory methods consist of systematic inductive guidelines for collecting and analyzing data to build middle-range theoretical frameworks that explain the collected data. Throughout the research process, grounded theorists develop analytic interpretations of their data to focus further data collection, which they use in turn to inform and refine their developing theoretical analyses. (p. 250)

Charmaz also described common analytic steps of grounded theory studies.

Less transparent are the details of how data collection and analysis interact in a practical sense. In this efficacy study, for example, the data collection and early analysis structure was distinct from the final detailed analysis structure. The realities of working as a single researcher (with one research assistant) have an enormous impact on the ability to manage large quantities of data with integrity. Thus, the collection and early analysis stage consisted of a funneling strategy that began with 50 participants, then decreased to 10, and finally reduced again to 2 participants. The purpose of the funneling strategy was to gain increasing depth of understanding about the details of participant experience. The funneling strategy drove data collection processes as well as the early analysis stages. This was distinct from the final detailed data analysis strategy, in which I began with the same 50 participants, then narrowed to investigate the 2 extreme case samples, and finally expanded the analysis to 10 participants in an hourglass strategy (see Figure 1).


Figure 1. Model for enhancing integrity of participant sample during data collection and analysis

For a single researcher, sample size might be dictated by the ability to manage data collection and analysis with integrity. The above model was used to examine preservice teacher efficacy and factors that influenced efficacy ratings during an intense 1-year program. The funnel strategy was used for data collection and initial analysis, with greater detail gathered as the number of participants decreased. The hourglass strategy was used during the final analysis stage, beginning with open and active coding of data from 50 participants, which refined and expanded the codes generated during the funneling stage. The analysis was then narrowed to only 2 participants who represented extreme cases, one participant who had a negative experience and one extremely positive case. This ensured that the analysis did not exclude noncongruence or contradictions in the findings. Finally, I conducted a cross-case analysis of 10 participants to confirm interpretations, thus increasing transferability of findings.

Using the hourglass strategy, I reexamined data from all 50 participants to increase my confidence that a full accounting of codes was complete. I was concerned that codes generated in the initial analysis might not have been comprehensive because of additional themes generated as analysis continued. I then focused in on 2 extreme cases for a full analysis based on all codes generated. The analysis out to 10 representative cases then confirmed that the 2 cases selected were the outlying cases and that participants followed similar trajectories during their preservice year. This increased my confidence in the theorized learning trajectory as a valid overall description of experiences for all participants.

Why is this distinction between the funnel and the hourglass important? Beyond the increased transparency of declaring the methods employed, the distinction between the two strategies is critical, particularly in terms of credibility of findings. The funnel strategy is typical in qualitative and mixed methods studies for narrowing the number of participants and managing related data while maintaining integrity of the sample. However, the hourglass strategy for analysis is not commonly described in the literature. As a researcher with few resources, I found that use of the hourglass strategy was manageable and effective.

Examining outlying cases is an important strategy for helping to ensure that norms, themes, or typical patterns of activity do not monopolize analysis to the point of ignoring atypical patterns or contradictory findings. Use of the hourglass strategy has the potential to increase researcher and reader confidence that model development and theory building are substantive. In this study, it was important to expand the analysis back out to 10 participants for three reasons: (a) to look for contradictions and noncongruence of findings; (b) to determine whether the two cases were, indeed, the extremes; and (c) to determine whether there were same, similar, or different patterns across a larger sample of participants, leading to researcher confidence in the development of a working model.
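As a rough structural sketch, the hourglass sequence can be expressed as three passes over the same participant pool. All helper functions and record fields below are hypothetical, standing in for the study's coding and cross-case procedures:

```python
def hourglass(participants, code_all, deep_analysis, cross_case, k=10):
    """Sketch of the hourglass: a wide coding pass over everyone, a deep
    analysis of the two extreme cases, then a confirming cross-case pass."""
    # 1. Wide mouth: open and active coding of data from all participants.
    codebook = code_all(participants)
    # 2. Narrow neck: full analysis of the negative and positive extremes.
    ranked = sorted(participants, key=lambda p: p["efficacy"])
    extremes = [ranked[0], ranked[-1]]
    case_findings = [deep_analysis(p, codebook) for p in extremes]
    # 3. Wide base: expand back out to k representative cases to look for
    #    contradictions and confirm the extremes really are outliers.
    step = max(1, len(ranked) // k)
    representative = ranked[::step][:k]
    return cross_case(representative, case_findings, codebook)
```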

Research reports that explicitly describe or depict collection and analysis strategies can be more convincing, yet written reports are expected to adhere to very specific guidelines, such as short text length, while simultaneously describing complex ideas. Charmaz (1994) and Larson (1997) explicitly excluded the use of diagrams and figures in an attempt to avoid overgeneralizations in grounded theory studies. As a departure from this position, I found that annotated diagrams of data collection and analysis processes offered a helpful visual depiction of some of the complexities of the data collection and analysis interaction processes.

Interactions between collection and analysis are obviously not fully accounted for in Figure 1. Figures 2 and 3 further illustrate the zigzag data collection and analysis approach used in Year 1 (pilot) and Year 2 of the study (see also Creswell, 2005). It is evident from these figures that each stage of data collection was followed by a period of analysis that subsequently informed the following data collection stage.

The purpose of including the diagrams is to explicitly illustrate the practical interaction between data collection and analysis, yet even detailed diagrams do not capture the entire process. Member checks, which occurred after each major stage of analysis, and participant coding of transcripts (interrater checks on open and active coding) are two examples of methods not accounted for in the diagrams. (One advantage of a smaller study is the ability to follow up on member checks that actively seek contradictions rather than simply moving the researcher agenda along.) The arrows between columns in Figures 2 and 3 illustrate a flow back and forth between collection and analysis. In reality, analysis was also occurring during the data collection events, as I made intuitive leaps. With the shortcomings of diagrams in mind, the use of prose statements or annotations is essential to enhance detail and complexities of data collection, analysis, and triangulation to further increase transparency. The danger of oversimplifying the method, even through detailed descriptions and diagrams, is an ongoing tension that I faced during the study.

What constitutes sufficient data?

The answer to this question is elusive. On investigation of existing grounded theory studies and this study, it has become clear to me that there is no magic number of events or types or periods of data collection. Harry et al. (2005) have suggested that they could have completed their large-scale study more successfully with approximately half the amount of data gathered from participants (272 open-ended interviews, 84 informal conversations, 627 classroom observations, 42 child study meetings, and related documents). The preservice teacher efficacy study was much smaller and more fine-grained in comparison, and it quickly became apparent that gathering qualitative data with 50 participants would be impossible given the limited resources I had. Furthermore, the purpose of the study was to examine carefully the particulars of participant experience and the details of how their efficacy developed while learning to teach mathematics.


The funneling strategy, combined with purposeful sampling (a stratified random sample), helped to narrow participant involvement to a manageable amount while maintaining integrity of the sample.

The pilot phase also yielded important information about group size for focus group interviews. The pilot phase included groups of six preservice teachers for focus groups, in which each participant competed for time to express his or her views, and there was a lack of sustained storytelling. Therefore, in the full study, the focus groups had a membership of between 3 and 5 participants. Although this is a much smaller group size than the 6 to 8 recommended by Kruger (1998), it proved to be more effective for purposeful interaction combined with opportunities for each participant to speak at length about their experiences.

The number of data "events" in a grounded theory study depends on a matrix of factors, including the scope and scale of the study, the level of detail required, the resources available, the topic, and the research questions.


In Figures 2 and 3, I illustrate the sequence of "events" in terms of how data collection and analysis were conducted over 2 years. Also made transparent in the figures are the distinctions and refinements made from the pilot phase to the full study phase. For example, the number of focus group interview participants was reduced in the full study, as group size was too large in the pilot phase, compromising the quality of the stories of experience shared by participants. Member checks and independent coding were completed by the researcher.

Figure 2. Diagram of zigzag approach for Year 1 (pilot year)

Figure 3. Diagram of zigzag approach for Year 2

Is there a minimum number of data events required? Typically, researchers ensure credibility through triangulation of data (drawing on multiple sources of information, individuals, or processes) and/or member checking (asking participants to check the accuracy of accounts) (Creswell, 2005). Ultimately, the number of data events is less important than the trustworthiness of the reporting.

The essence of the data quantity issue lies, perhaps, in the saturation of data. Once an exhaustive listing of codes, categories, or themes has been achieved, and no other codes are identifiable, the researcher can be assured to the greatest extent possible that sufficient data have been collected and that new data will not provide additional information or interpretation. The problem for the researcher is that it is difficult to know whether the codes have been saturated (and sufficient data have been gathered) until the data have been analyzed and the study is complete. Many researchers are led to one of three solutions: (a) the researcher collects more data than necessary to ensure saturation of codes, (b) he or she returns to the study site and collects more data after the analysis phase is "complete," or (c) he or she conducts preliminary coding of data whenever possible during the data collection process. None of these solutions is ideal in terms of use of research resources or method. Under ideal circumstances, the underlying structure of moving back and forth between data collection and analysis is accounted for and planned. In the example study, the ideal data collection conditions were available. The rhythm of the bachelor of education program itself, combined with the opportunity to conduct a pilot phase and an actual study phase, was optimal for collecting data, analyzing data, and then refining the following round of data collection. These conditions are likely more the exception than the norm.
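One crude but useful way to operationalize this judgment is to track how many previously unseen codes each collection round contributes. The Python sketch below, with hypothetical codes, treats a run of rounds yielding no new codes as a sign, not a proof, of saturation:

```python
def new_codes_per_round(rounds):
    """Count how many previously unseen codes each round of data
    collection contributes to the cumulative codebook."""
    seen, new_counts = set(), []
    for codes in rounds:
        fresh = set(codes) - seen
        new_counts.append(len(fresh))
        seen |= fresh
    return new_counts

# Hypothetical coding output from four successive collection rounds.
rounds = [
    {"anxiety", "prior_failure"},
    {"anxiety", "mastery_experience"},
    {"mastery_experience", "modelling"},
    {"anxiety", "modelling"},  # nothing new: a hint of saturation
]
print(new_codes_per_round(rounds))  # -> [2, 1, 1, 0]
```

A zero count in the final rounds is only suggestive; as the paragraph above notes, confidence in saturation ultimately comes from the analysis itself.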

The tension between the value of the specifics of a fine-grained qualitative study and studies with wider data collection is at the heart of grounded theory dilemmas. How does the researcher make claims about new theories that are convincing to the research community while investigating the particulars of experience? It is this balance between richness (and detail in the particulars of the findings) and application to broader models and theory building that continues to plague grounded theory research studies. In the efficacy study, the hourglass approach to data collection and analysis, as well as implementing the study over two phases, facilitated a balance that eased some of these tensions.

Discussion

Grounded theory studies are "grounded" in the data collected to develop or refine models of understanding through an inductive process. There is an assumption that the researcher approaches the study in a state of neutrality and has the important role of describing the situation in a non-evaluative way, so that participant voice is valued over that of the researcher. Over the years in educational research, however, there has been a tendency to be evaluative in efforts to identify "most effective practices" in educational settings. The researcher is already familiar with the theoretical frameworks and literature of the field, and carries a personal professional definition of what constitutes most and least effective practices. To position grounded theory, or perhaps any method in the educational research context, as entirely inductive is an overgeneralization. In reality, there is a give-and-take relationship between the researcher and the situation being investigated. The researcher becomes interested in a situation. She or he begins to read more about similar situations. Furthermore, the researcher would likely want to ensure that the study is worthwhile in terms of value to the research and educational communities, particularly as it relates to funding and enabling a sense of agency from the findings. Before the research begins, the researcher must prepare proposals, undergo ethical reviews, and invite participants. In other words, researchers must be strategic. The theorizing has already begun. The value of prior experiences must be acknowledged and explicitly referenced. This underlines and makes possible the level of complexity of understanding essential to educational research. Induction and prior theory examination must be seen not as oppositional but as complementary. A researcher can be engaged in a situation both theoretically and practically and still examine the data using multiple lenses and interpretations. Indeed, researchers theorize on an ongoing basis.

Ethical review procedures cause additional complications for emergent design studies. The details of data collection methods must be disclosed for ethical review. Problems can arise when the researcher recognizes that procedural changes are required. Using a qualitative emergent design such as constructivist grounded theory is more time consuming and labor intensive, and requires specific conditions for success. Indeed, researchers might be discouraged from using this method in some cases.

Those grounded theory studies that occur in phases allow for a higher level of emergence not afforded in shorter studies. The researcher tests out data collection methods for two purposes: first, to see whether the data yield is rich; and second, to begin the analysis of early data to inform future phases of study.


Theorizing then finds a place within the literature and within the data collection phases.

In response to the call for transparency of methods, qualitative reports must make every effort to describe the details of collection and analysis so that the reader believes the study to be credible, transferable, and dependable, whether claims are made about generalizability or not. "Such efforts at transparency will make our work more accessible to others, and their subsequent judgments will ultimately be of benefit to us" (Demerath, 2006, p. 104). One strategy for enhancing transparency, in light of the limited space requirements of reports, is the use of annotated diagrams providing the reader with graphically explicit images of the data collection and analysis procedures. No form of description captures the process entirely, and by simply acknowledging this shortcoming, the reader might have a more realistic sense of the complexities of the processes.

Optimal conditions for a constructivist grounded theory study

There appear to be specific conditions under which a constructivist grounded theory study is possible and appropriate. These include obvious but essential conditions described by methods experts to date (conditions 1, 2, 3, and 6, below), and conditions that have been less clearly articulated in the literature but require further consideration (particularly conditions 4, 5, and 7).

1. The purpose of the study is to examine and explain an educational process qualitatively, usually in situations where the research literature is limited and theories require generation or refinement (Glaser & Strauss, 1967).

2. The timelines of the study are organized to ensure that periods of data collection can be followed directly by analysis prior to the following data collection period. This ensures that appropriate types and amounts of data are collected effectively and lead to saturation without depleting resources unnecessarily (Creswell, 2005).

3. Extensive pilot work can be extremely important to an effective study. According to Wholey (1986), it takes the researcher up to one third of the full research timeframe to evaluate the best approaches for collection and analysis methods. The time available for the study should be sufficient to ensure that this "acclimatization" can occur.

4. To increase credibility, a portion of data is "reserved" and used to confirm interpretations of detailed findings. Models such as the hourglass strategy, developed in the example study, are effective in facilitating this process.

5. To increase dependability and transferability, an audit trail of data collection and analysis methods is explicitly made available for reader inspection. Tables and figures combined with rich descriptions can support this process. The table of data collection methods and purposes, as well as the zigzag data collection and analysis figures provided in the sample study, are examples of how dependability and transferability may be increased.

6. The researcher understands and ensures that the study is thoroughly grounded in the data while being informed by existing theoretical frameworks and research literature, and therefore does not make claims of using an entirely inductive process (Charmaz, 2003).

7. The researcher is prepared to engage in a predominantly emergent process where the themes emerge from the data, leading to a middle-range theory or working model, while acknowledging and managing the extensive preplanning required for successful data collection and analysis that meets ethics demands and uses limited resources effectively.

Paradigms of emergence, induction and deduction, and qualitative and quantitative traditions are distinguished from one another in educational research to clarify purposes, methods, and perspectives. In this article, I have questioned the extremist positioning of reported studies within a given paradigm, which leads to oversimplification and underrepresentation of data collection, management, and analysis methods in educational research. Transparency and clarity of the methods used, as well as an acknowledgement of tensions faced in relation to method, are essential ingredients for increasing our understanding of complexities and ensuring integrity of conclusions.

This article is a response to the call for greater communication and transparency of qualitative research methods, particularly in illustrating ways of managing data collection and its subsequent relationship to analysis in grounded theory studies. Tensions of attempting to use a predominantly emergent design were sufficiently reconciled during the sample study by using phases of implementation and acknowledging the realistic constraints that are placed on emergence in educational research.


In this article, I have offered some practical suggestions for conducting small-scale grounded theory studies that are transparent, credible, transferable, and dependable.

References

Anfara, V., Brown, K., & Mangione, T. (2002). Qualitative analysis on stage: Making the research process more public. Educational Researcher, 31(7), 28-36.

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice-Hall.

Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W. H. Freeman.

Berg, B. L. (2001). Qualitative research methods for the social sciences. Needham Heights, MA: Pearson.

Bruce, C. (2005). Teacher candidate efficacy in mathematics: Factors that facilitate increased efficacy. In G. A. Lloyd, S. Wilson, J. L. M. Wilkins, & S. L. Behm (Eds.), Proceedings of the Twenty-Seventh Psychology of Mathematics Association-North America [CD-ROM]. Eugene, OR: All Academic.

Charmaz, K. (1990). Discovering chronic illness: Using grounded theory. Social Science & Medicine, 30, 1161-1172.

Charmaz, K. (1994). Identity dilemmas of chronically ill men. Sociological Quarterly, 35, 269-288.

Charmaz, K. (2003). Grounded theory: Objectivist and constructivist methods. In N. K. Denzin & Y. Lincoln (Eds.), Strategies of qualitative inquiry (pp. 249-291). Thousand Oaks, CA: Sage.

Conference Board of the Mathematical Sciences. (1975). Overview and analysis of school mathematics, Grades K-12. Washington, DC: National Advisory Committee on Mathematical Education.

Creswell, J. (1998). Qualitative inquiry and research design: Choosing among five traditions. Thousand Oaks, CA: Sage.

Creswell, J. (2005). Educational research: Planning, conducting, and evaluating qualitative research. Upper Saddle River, NJ: Merrill Prentice Hall Pearson Education.

D'Ambrosio, B., Boone, W., & Harkness, S. (2004). Planning district-wide professional development: Insights gained from teachers and students regarding mathematics teaching in a large urban district. School Science & Mathematics, 104(1), 5-15.

Demerath, P. (2006). The science of context: Modes of response for qualitative researchers in education. International Journal of Qualitative Studies in Education, 19(1), 97-113.

Denzin, N. K., & Lincoln, Y. S. (Eds.). (2003). Landscape series of the handbook of qualitative research. Thousand Oaks, CA: Sage.

Gibson, S., & Dembo, M. (1984). Teacher efficacy: A construct validation. Journal of Educational Psychology, 76(4), 569-582.

Glaser, B., & Strauss, A. (1967). The discovery of grounded theory. Chicago: Aldine.

Goddard, R., Hoy, W., & Woolfolk Hoy, A. (2004). Collective efficacy beliefs: Theoretical developments, empirical evidence, and future directions. Educational Researcher, 33(3), 3-13.

Greckhamer, T., & Koro-Ljungberg, M. (2005). The erosion of a method: Examples from grounded theory. International Journal of Qualitative Studies in Education, 18(6), 729-750.

Harry, B., Sturges, K., & Klinger, J. (2005). Mapping the process: An exemplar of process and challenge in grounded theory analysis. Educational Researcher, 34(2), 3-13.

Hiebert, J. (1999). Relationships between research and the NCTM standards. Journal for Research in Mathematics Education, 30(1), 3-19.

Kruger, R. (1998). Analyzing and reporting focus group results. Thousand Oaks, CA: Sage.

Larson, B. W. (1997). Social studies teachers' conceptions of discussion: A grounded theory study. Theory and Research in Social Education, 25, 114-146.

Lester, F. K. (1996). Criteria to evaluate research. Journal for Research in Mathematics Education, 27(2), 130-132.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage.

Mascall, B. (2003). Leaders helping teachers helping students: The role of transformational leaders in building teacher efficacy to improve student achievement. Unpublished doctoral dissertation, University of Toronto, Toronto, Canada.

Mills, J., Bonner, A., & Francis, K. (2006). The development of constructivist grounded theory. International Journal of Qualitative Methods, 5(1), Article 3. Retrieved June 19, 2006, from http://www.ualberta.ca/~iiqm/backissues/5_1/html/mills.htm

Muijs, D., & Reynolds, D. (2001, April). Being or doing: The role of teacher behaviors and beliefs in school and teacher effectiveness in mathematics, a SEM analysis. Paper presented at the Annual Meeting of the American Educational Research Association, Seattle.

Riggs, I., & Enochs, L. (1990). Toward the development of an elementary teacher's science teaching efficacy belief instrument. Science Education, 74(6), 625-638.

Ross, J. A. (1998). The antecedents and consequences of teacher efficacy. In J. Brophy (Ed.), Research on teaching (Vol. 7, pp. 49-74). Greenwich, CT: JAI.

Ross, J., McDougall, D., & Hogaboam-Gray, A. (2002). Research on reform in mathematics education, 1993-2000. Alberta Journal of Educational Research, 48(2), 122-138.

Seale, C. (1999). The quality of qualitative research. Thousand Oaks, CA: Sage.

Strauss, A. (1987). Qualitative analysis for social scientists. New York: Cambridge University Press.

Tschannen-Moran, M., & Woolfolk Hoy, A. (2001). Teacher efficacy: Capturing an elusive construct. Teaching and Teacher Education, 17, 783-805.

Tschannen-Moran, M., Woolfolk Hoy, A., & Hoy, W. (1998). Teacher efficacy: Its meaning and measure. Review of Educational Research, 68(2), 202-248.

Vitali, G. (1993). Factors influencing teachers' assessment and instructional practices in an assessment driven educational reform. Unpublished doctoral dissertation, University of Kentucky, Lexington.

Wholey, J. (1986). Evaluability assessment: Developing program theory. In L. Bickman (Ed.), New directions for program evaluation (pp. 77-92). San Francisco: Jossey-Bass.
