
Leah E. Wickersham, Texas A&M University-Commerce, PO Box 3011, Department of Secondary and Higher Education, Commerce, TX 75429. Telephone: (903) 468-3248. E-mail: [email protected]

The Quarterly Review of Distance Education, Volume 7(2), 2006, pp. 185-193. ISSN 1528-3518

Copyright © 2006 Information Age Publishing, Inc. All rights of reproduction in any form reserved.

A CONTENT ANALYSIS OF CRITICAL THINKING SKILLS AS AN INDICATOR OF QUALITY OF ONLINE DISCUSSION IN VIRTUAL LEARNING COMMUNITIES

    Leah E. Wickersham

Texas A&M University-Commerce

    Kim E. Dooley

    Texas A&M University

Online discussion is a common tool for creating learner-learner interaction. Whole-class discussions can result in potentially hundreds of postings, with students spending more time creating the illusion of participation than on critical reflection and deeper learning. The purpose of this study was to determine the quality of online discussion, based on critical thinking constructs, when learners were placed in smaller learning communities and not exposed to whole-class discussion. The researchers sought to determine whether discrepancies would exist among the groups and whether students placed in smaller groups would still receive the full benefit of learner-learner interaction.

    INTRODUCTION

Online courses are not so dissimilar from their face-to-face counterparts, with interactive discussion playing an integral role in the teaching and learning process. The asynchronous approach offers several advantages not present in the synchronous environment: all students have the ability to interact and participate in the discussion, to learn at their own pace, and to reflect and respond within an expanded timeframe.

New challenges, however, have emerged as a result of moving the discussion from the traditional to the virtual environment. More responsibility is placed on the student to engage in this process, and the instructor is faced with the task of analyzing the quality of participation and measuring student learning.


Moore's editorial in The American Journal of Distance Education (1989) identified three types of interaction in distance education: learner-content, learner-instructor, and learner-learner. Interaction between learner and content implies that construction of knowledge occurs when the learner interacts with the course content, and that one's understanding changes when new knowledge is combined with preexisting knowledge. Interaction between the learner and instructor reinforces the learner-content interaction, using engagement and dialog exchange to promote the learning process with explanation, discussion, examples, and/or application activities. Interaction between learner and learner is essential in distance education if participation in class discussions is to take place. This interaction can happen one-on-one or within a group setting, depending on the design of the course.

Northrup (2002) studied online learners' preferences for interaction or engagement in learning. She found four variables that serve as indicators of interaction: (1) content interaction, including the structure, pacing, and use of various delivery technologies and interactive strategies; (2) conversation and collaboration through peer interaction and participation in a learning community; (3) intrapersonal/metacognitive interaction through self-monitoring and the provision of cognitive strategies such as note-taking guides and advance organizers; and (4) support through mentoring, tutorials, and timely correspondence with the instructor.

One way to promote interaction and collaboration is through online discussions. However, the quality of discussion and the amount of student participation in a course can be cumbersome to measure. For example, a course with an enrollment of 30 students and a requirement to read and respond to all postings can be viewed as busy work, with very little meaningful discussion and learning taking place. It is often confusing and time-consuming to sift through what could result in hundreds of postings in an online discussion. Students may spend more time and effort creating an illusion of participation through one- to two-sentence postings to many discussion threads than on in-depth, meaningful discussion among a few, resulting in a failure to achieve what the instructor had intended: thoughtful reflection and meaningful discussion.

If a student's grade is partially determined by participation in online discussions, instructors face another challenge: determining how to assess the quality of discussion and whether learning is in fact taking place. But what tools are available to determine quality in online discussions and whether higher-order thinking skills are being developed?

    THEORETICAL FRAMEWORK

Virtual learning communities promote learning when instruction, social interaction, and technology activities are present (Tu & Corry, 2002). Research has shown that online discussion helps students understand course objectives, provides real-world applications, and promotes interaction (Edelstein & Edwards, 2003; Palloff & Pratt, 1999; Simonson, Smaldino, Albright, & Zvacek, 2003). It is assumed that the facilitator or course instructor will consider how much time (or what number of postings) a student needs to participate, but determining how the discussions contribute to the achievement of the course objectives, how well the student is performing in the course, and how much learning is happening as a result of the discussion is another story. Edelstein and Edwards (2003) created an assessment rubric for student participation in threaded discussions based on five constructs: (1) promptness and initiative, demonstrating self-motivation and consistent engagement in the course content; (2) attention to detail in the delivery of the post (such as grammar and spelling); (3) relevance of the post in relation to the course topic and objectives; (4) how well opinions and ideas are expressed within the post; and (5) contribution to the learning community. Although this rubric is useful in determining participation beyond simply counting postings, it does not provide an indication of the depth of understanding of the content in terms of metacognition, problem solving, and critical thinking.

Several researchers have used various models to measure intellectual development and critical thinking within online discussions (Marra, 2002; Marra, Moore, & Klimczak, 2004; Newman, Webb, & Cochrane, 1996; Visser, Visser, & Schlosser, 2003). Marra (2002) suggests that as complex understandings develop, learners are able to see knowledge "as being defined and shaped by the context in which it must be applied" (p. 16). Marra (2002) provides a rubric for determining whether concepts discussed using online conversation tools are descriptive of the content domain, whether they are embedded and interconnected, and whether links are descriptive and efficient. She posits that effective online learning environments should scaffold and support complex intellectual development.

The model used by Newman et al. was based on Garrison's (1992) five stages of critical thinking and Henri's (1992) cognitive skills needed in computer-mediated communication. Newman et al. (1996) cite Mason (1992) regarding how instructors rely on counting messages and logons to determine participation in threaded discussions, with little thought of what constitutes good work or the quality of student learning. These authors support content analysis of the written narrative of the online discussion as another assessment tool for collaborative learning, critical thinking, and deep understanding of course material.

If we believe that deep learning is promoted by active engagement and that cognitive skills are developed in a social context (Lipman, 1991; Resnick, Levine, & Teasley, 1991; Tu & Corry, 2002), then threaded discussions within a virtual learning community could promote deep learning. Right? But how do we measure deep learning or critical thinking within virtual communities? Henri (1992) suggests five dimensions: (1) participative; (2) social; (3) interactive; (4) cognitive; and (5) metacognitive. For the purposes of this study, the researchers chose to focus on the cognitive dimension only, because they were interested in measuring learning, not just participation (self-direction) and social functions within a virtual learning community.

Garrison (1992) provided a five-stage model to measure critical thinking skills: (1) problem identification; (2) problem definition; (3) problem exploration; (4) problem evaluation/applicability; and (5) problem integration. Newman, Webb, and Cochrane (1996) eloquently combined these two models with Mason's (1992) suggestions based on the educational value exhibited within online discussion: Do the learners build on previous messages? Do they draw on their own experience? Do they refer to course materials? Do they refer to relevant material outside the course? Do they initiate new ideas? The resulting model codes provide indicators of critical and uncritical thinking in 10 areas: (1) relevance; (2) importance; (3) novelty; (4) outside knowledge or experience being brought to bear on the problem; (5) ambiguities clarified or confused; (6) linking ideas; (7) justification; (8) critical assessment; (9) practical utility (grounding); and (10) width of understanding. This model served as the theoretical framework for this study. Previous studies have used the content analysis method to measure critical thinking in face-to-face and computer-supported group learning for whole-class instruction, but they have not examined the impact of smaller learning communities within a course. Would there be differences in deep learning or critical thinking if students are exposed only to their own learning team rather than to the breadth of discussion and perspectives from the entire class?
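For readers who want to operationalize the framework, the following minimal Python sketch (ours, not part of the original study; the category keys and indicator wording are paraphrased from Newman et al., 1996) encodes each of the 10 areas as a pair of positive (critical) and negative (uncritical) indicators:

```python
# A paraphrased encoding of the 10 indicator pairs from Newman, Webb,
# and Cochrane (1996). Keys are our own shorthand; each value pairs a
# positive (critical) indicator with a negative (uncritical) one.
INDICATORS = {
    "relevance": ("relevant statements",
                  "irrelevant statements or diversions from the topic"),
    "importance": ("important points or issues",
                   "unimportant, trivial points or issues"),
    "novelty": ("new problem-related information, ideas, or solutions",
                "repetition, false or trivial leads, accepting the first solution"),
    "outside_knowledge": ("personal experience, course materials, or relevant outside material",
                          "reliance on preconceived notions and assumptions"),
    "ambiguities": ("clear, unambiguous statements",
                    "confused statements"),
    "linking": ("linking facts, ideas, and notions; generating new data",
                "repeating information without inference or interpretation"),
    "justification": ("proof or examples; weighing advantages and disadvantages",
                      "irrelevant or obscure examples; judgments without explanation"),
    "critical_assessment": ("critical evaluation of one's own or others' ideas",
                            "uncritical acceptance or unreasoned rejection"),
    "practical_utility": ("grounding solutions in familiar situations",
                          "discussing ideas in a vacuum; impractical solutions"),
    "width_of_understanding": ("widening the discussion to the broader picture",
                               "narrowing the discussion to fragments of the situation"),
}
```

A structure like this simply makes the coding scheme explicit before any narrative is analyzed; in Newman et al.'s own notation, each area is marked with positive or negative scores during coding.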

PURPOSE AND RESEARCH OBJECTIVES

The purpose of this study was to determine whether the critical thinking skills model posed by Newman et al. (1996) could be used to indicate the quality of online discussion when learners are placed in smaller learning communities and not exposed to whole-class discussion. Would there be discrepancies among the groups? Would one team show a great amount of critical thinking within its discussion while another only scratched the surface? Would students placed in a smaller group fail to receive the full benefit of learner-to-learner interaction, or would they have greater intimacy and deeper conversations as a result of being in a smaller group?

    METHODS

For this study, the primary source of data was narrative. Therefore, accepted qualitative research standards drove the methods (Lincoln & Guba, 1985). There were 30 respondents within six virtual learning communities (five learners in each community). Each learner was given a number based on the order in which they responded: 1-5A for virtual team A, 1-5B for virtual team B, and so forth. Research procedures included a review of all 11 discussion forums for the semester, which resulted in a substantial amount of data. It was determined that all of the discussion topics were consistent in length and quality of postings; therefore, one discussion forum was selected for content analysis. The topic of this discussion was the strengths and challenges of learner-centered instruction. The graduate students enrolled in this course read a chapter from their text and additional research articles on learner-centered instruction prior to engaging in the discussion. Each learner was required to submit an original posting and to reply to at least one other virtual team member. The instructor made it clear that the basis for grading was quality, not quantity, of posting, but the learners were not given the framework for analysis.

The critical thinking skills model (Newman et al., 1996) was used as the basis for analysis. The researchers created a color-coding system with highlighters to identify the 10 major categories within the model and used a "read aloud" protocol with consensus-building measures. An audit trail was kept to verify the data sources for each of the critical thinking categories, by color and number, for each respondent within a virtual team. For transferability, the researchers chose exemplary discussions for each of the 10 indicators to provide thick description (Geertz, 1973).
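As an illustration only (the study used highlighters and hand-kept records, not software), a coded segment can be modeled as a respondent code such as "4A," a category key, and a positive/negative flag, with the audit trail as the aggregation of those tags by team and category:

```python
from collections import defaultdict

# Hypothetical coded segments: (respondent code, category key, positive?).
# Respondent codes follow the study's scheme: "4A" is respondent 4 on
# virtual team A. The example rows echo excerpts quoted in the Findings.
codings = [
    ("4A", "relevance", True),
    ("5B", "importance", True),
    ("2A", "ambiguities", False),  # a confused statement
]

# Aggregate codings into team -> category -> respondent entries, the
# structure that Table 1 summarizes. The sign is kept with each entry
# because the study's ambiguities row records negative (confused) codings.
audit_trail = defaultdict(lambda: defaultdict(set))
for code, category, positive in codings:
    respondent, team = int(code[:-1]), code[-1]
    audit_trail[team][category].add((respondent, "+" if positive else "-"))

for team in sorted(audit_trail):
    for category, members in sorted(audit_trail[team].items()):
        print(f"Team {team} / {category}: {sorted(members)}")
```

This is a sketch of the record-keeping structure, not of the coding judgment itself, which the study handled through the read-aloud, consensus-building protocol.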

    FINDINGS

The findings are presented with exemplary discussion excerpts for each critical thinking category across all the virtual learning communities. An audit trail with respondent codes serves as a trustworthiness measure of the presence of critical thinking across the 10 categories within the theoretical model.

For relevance, the researchers were looking for relevant or irrelevant statements and/or diversions from the topic. An example of a relevant statement found in virtual community A is:

    Learner centered instruction refers to actively involving students in the planning, implementation, and self-evaluation process of their education. Students who are involved in their own learning relate better to the material and process information according to their own learning type. An increase in desire to learn and problem solving skills are the product. (4A)

The second model code is importance. The researchers were looking for important points/issues, or whether the learners used unimportant, trivial points/issues, for this category. One respondent noted:

    Instruction will require teachers to prepare in advance, acquire the needed resources, and effectively monitor and provide feedback to the students. Also, the classroom can become noisy and perhaps somewhat disorderly during certain times of the day since many different activities could be occurring at the same time. (5B)

The third category is novelty: new information, ideas, or solutions. This category includes putting forth new problem-related information, new ideas for discussion, and new solutions to problems, as well as welcoming new ideas and bringing new items into the discussion. On the contrary, if a respondent repeats what has been said, provides false or trivial leads, accepts the first solution offered, or has to be dragged into the discussion by the instructor, this is a negative indicator of novelty. A good example is 1F, who commented, "There are teachers and professors who cling to teacher-oriented approaches (such as lecture) because of fear. The fear is also generated by pressure, failure, and laziness."

The next category is bringing outside knowledge or experience to bear on the problem. Positive indicators include drawing on personal experience, referring to course materials, using relevant outside material or previous knowledge, and incorporating course-related problems brought in from lectures, texts, and other materials. If learners rely only on their preconceived notions and assumptions, this is a negative indicator. To illustrate, 3B discussed his experiences abroad:

    I have spent time in several foreign countries and I have lived abroad for two years. In most places that I have traveled I made a conscious effort to learn the language of the people and culture whom I am visiting. I do this for two reasons. First, I LOVE to learn. Secondly, the native people of the country that I am visiting always seem to appreciate my effort and this results in a very friendly rapport for my time in the country. But from this I learned that discovery learning, or learning in a personal context, has been more effective to me picking up the language as opposed to reading a book to studying vocabulary flash cards. (3B)

Another respondent provided these insights:

    I'm faced with students that without constant direction they would be totally lost and find it hard to keep up. I try to build the self-confidence and esteem of my students at an early age and sometimes when faced with students that dominate the lessons their egos could be affected. I feel that the best of both worlds, traditional and a learner-centered environment, is the best way to go. (5A)

The fifth category is ambiguities: whether the respondent made clear, unambiguous statements or confused statements. Only four responses (2A, 3D, 3F, 5F) were ambiguous within this topic. A negative indicator of ambiguity is:

    I think that the learner needs to define what causes learning to happen. When they understand what learning is being sought then they will find all the different methods of learning. Therefore, this is why we need to help students recognize these abilities. (2A)

The next model code is linking ideas, implying that the learner links facts, ideas, and notions and generates new data from information collected. If the learner repeats information without making inferences or interpretations, or states that he or she shares an idea or opinion without taking it further, it is a negative indicator. After a statement made in an area of importance, respondent 1D linked by adding:

    There is so much more information readily available to students through the Internet. The teacher is no longer the source of knowledge. Students can acquire information and act on that information themselves. This acquisition of information is a strength of learner-centered instruction. By using the Internet, or other technologies, students learn to think critically about what information is: is it opinion, fact, etc. (1D)

The next category is justification. A respondent who uses critical thinking within a post should provide proof or examples to justify his or her solutions or judgments, including discussing the advantages and disadvantages of solutions posed. If the learner uses irrelevant or obscure questions or examples, or offers judgments without explanations, this is a negative indicator of justification. In discussing the strengths of learner-centered instruction, 2C justified by stating:

    In a learner centered classroom the teacher is working along side the students in seeking information that the student wants and needs to learn. This promotes a positive student teacher relationship where the student can feel unthreatened and able to approach the teacher for information. This aspect of the learner centered environment can put a lot of strain on a teacher because there may be several students and one teacher. The teacher is left trying to meet the demands of many students, and the students may feel that their needs are not being met. (2C)

The eighth category used to determine quality of critical thinking is critical assessment. When learners pose an idea, do they critically assess or evaluate their own idea or the ideas of others, or do they simply accept what is being said or unreasonably reject it without additional input? To illustrate this point, one respondent in virtual team F disagreed with another team member's post:

    I disagree with the statement, "giving up control of the classroom," as some opposition may think. With society and technology on the rise, this allows students more control and independency of their education. Teachers are implementers and can enhance and even add to previous knowledge. I use this statement for older students that have already learned the basics. (3F)

The next category is practical utility (grounding). If the respondent relates possible solutions to familiar situations and discusses the practical utility of new ideas, it is a positive indicator. If he or she discusses ideas in a vacuum or suggests impractical solutions, it is negative. Because most respondents were teachers and educators, many included practical examples, as indicated in Table 1. The following is an example of practical utility:

    This other edge of the sword though is their current ill-preparedness to adapt to the learner-centered approach. This past spring semester I moved from a pure lecture approach to a lecture with PP[T] slides (and handouts) and study guides to work. My goal was to engage my students more in their own learning. I placed a grade component (10% of the final grade) on attendance and study guide completion. I did in fact see the grades improve over those of past semesters. However, after the second test, when asked by what turned out to be a mediocre student why the test was so hard, I pointed out that several in other classes had received great grades. She responded, "Oh, I know one of them but she studies all the time." That statement alone points out, at least at the higher education level, one of the greatest challenges: that of providing new tools to the student who is ill prepared or undisciplined to use the tools. (4D)

The last model code is width of understanding. Learners who widen the discussion within a broader perspective, or who provide intervention strategies within a wider framework, get the "big picture." If they narrow the discussion or address only fragments of a situation, they are not contributing width of understanding. Respondent 2E, in discussing variables to consider in learner-centered instruction, added:

    Since it hasn't ever worked the way we thought it would, perhaps we need a different approach. Starting at the top level, the teacher, and go down from there. I'm wondering if the student is much of a variable at all here. I guess that's why the learner is in the center of the learner-centered environment. Hmm. (2E)

For the 10 categories within the critical thinking skills model, audit trail codes were kept for each team (A-F), with five respondents on each team. Table 1 provides a visual snapshot of the presence of the critical thinking categories within each group.

TABLE 1. Audit Trail Codes of Critical Thinking Skill Indicators by Respondent Within Virtual Learning Communities

Category               | Team A        | Team B        | Team C        | Team D        | Team E        | Team F
Relevance              | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5
Importance             | 1, 3, 4, 5    | 2, 3, 5       | 1, 2, 4       | 2, 4          | 2, 5          | 1, 3
Novelty                | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5
Outside knowledge      | 1, 3, 4, 5    | 1, 3, 4, 5    | 2, 3, 4       | 1, 2, 3, 4, 5 | 1, 2, 4, 5    | 1, 3
Ambiguities            | 2             |               |               | 3             |               | 3, 5
Linking                | 1, 2, 3, 5    | 1, 3, 4, 5    | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4
Justification          | 2, 3, 4, 5    | 1, 2, 3, 5    | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5 | 1, 2, 3, 4, 5
Critical assessment    | 1, 4, 5       | 1, 2, 3, 5    | 1, 2, 4       | 1, 2, 4       | 2, 3, 4, 5    | 1, 3, 4
Practical utility      | 5             | 1, 2, 3, 4, 5 | 1, 3          | 2, 4          | 1, 2, 5       | 1, 3, 4
Width of understanding | 1, 2, 4, 5    | 1, 2, 3, 5    | 2, 4          | 1, 2, 3, 4    | 1, 2          | 1, 3, 4

Note: Entries in the Ambiguities row mark the respondents whose statements were coded as ambiguous (a negative indicator).
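As a brief illustration of how a table like this supports the per-respondent claims in the conclusions, the following sketch (Team A's column transcribed by us from Table 1; the Ambiguities row is excluded because it records negative codings) counts category coverage per respondent:

```python
# Team A's column of Table 1, transcribed for illustration only. The
# ambiguities row is omitted: it records negative (confused) codings
# rather than the presence of a critical-thinking indicator.
TEAM_A = {
    "relevance": {1, 2, 3, 4, 5},
    "importance": {1, 3, 4, 5},
    "novelty": {1, 2, 3, 4, 5},
    "outside_knowledge": {1, 3, 4, 5},
    "linking": {1, 2, 3, 5},
    "justification": {2, 3, 4, 5},
    "critical_assessment": {1, 4, 5},
    "practical_utility": {5},
    "width_of_understanding": {1, 2, 4, 5},
}

# Count how many of the nine positive categories each respondent hit;
# respondent 5A, for example, appears in all nine.
for respondent in range(1, 6):
    hits = sum(respondent in coded for coded in TEAM_A.values())
    print(f"Respondent {respondent}A: {hits} of 9 positive categories")
```

Run over the full table, a tally like this surfaces the members of each team whose postings span nearly all categories, the pattern reported in the conclusions.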

The researchers noticed some postings fully integrating most or all of the components of the critical thinking model. This integration cannot be expected in every post, but it does highlight the cognitive complexity of individuals within each team. The following is an example of integration of all components within the critical thinking model:

    Learner-centered instruction places the focus or control of learning on individual students. The teacher's role changes to facilitator and often is considered a co-learner. The student becomes an active participant in exploratory learning.

    There are many strengths of learner-centered instruction. Students are active participants and take more responsibility in their learning as they explore and discover different topics. Learner-centered instruction allows opportunities for student interaction and collaboration. This interaction can occur with the teacher, other students, and even other people not enrolled in the class. Motivation can also increase as the student is allowed selection of topics and strategies. However, the strongest advantage of learner-centered instruction I see is the development of lifelong learners. Students are not only gaining subject matter knowledge, they also gain problem-solving skills and collaboration skills.

    There are also a few challenges of learner-centered instruction. As stated in the text, a single approach to all instruction will not work. The first challenge involves the need to change the philosophy or attitude of experienced teachers, administrators, and some of the parents. Approaches used in previous years need to be modified to incorporate new technologies and methods of learning. Instruction will require teachers to prepare in advance, acquire the needed resources, and effectively monitor and provide feedback to the students. Also, the classroom can become noisy and perhaps somewhat disorderly during certain times of the day since many different activities could be occurring at the same time.

    As you can see there are many things to consider when evaluating the use of learner-centered instruction in the schools. I feel that if this method promotes lifelong learners then it is definitely worth pursuing. (5B)

    CONCLUSIONS AND IMPLICATIONS

Based on the exemplary discussion examples provided for each indicator in the critical thinking skills model, a summary of findings and implications is provided here in relation to the purpose and research questions posed in this study. The researchers sought to determine whether the critical thinking skills model (Newman et al., 1996) could be used to indicate the quality of discussion in an online forum when learners were placed in smaller learning communities as opposed to being exposed to whole-class discussions.

Analysis of the discussions determined that all individuals within each learning community did engage in relevant discussion and brought new/novel ideas to their community. The majority of respondents on each team managed to incorporate several of the categories, but not all learners fully integrated the 10 components of the critical thinking skills model within their discussion. However, each virtual learning community had at least one person whose discussions fully integrated 9-10 of the categories. Critical assessment, practical utility, importance, and width of understanding were the four components of the model for which a few communities had only two to three respondents who integrated these categories. Adult learners bring their prior experiences and knowledge into the classroom and engage in discussion regarding issues relevant to their situation. The implication is that learners with more experience shape the learner-learner interaction within their virtual learning community.

The researchers found that several new/novel ideas within each group were convergent; however, divergent novel ideas that drove the conversations were also discovered among communities. Even with the introduction of divergent ideas, the discussions remained relevant to the topic. Further research is needed to determine, via cross-case analysis, the convergent and divergent themes that emerge within virtual learning communities. Would there be specific patterns of reflection and metacognition, unique to individuals in the groups, that could shape the quality of the discussions?

Findings indicate that all groups engaged in critical thinking within their virtual learning communities and that a high amount of interaction occurred within each community; however, further research is needed to determine whether the same level of critical thinking and interaction would occur if students were exposed to, and expected to interact in, a whole-class discussion. Would students seek out a few individuals within the class and engage in meaningful discussion with them? Would they self-select a smaller community and continue in the discussion forums with the same individuals throughout the semester, similar to the virtual learning communities?

The critical thinking skills model provided an excellent framework for content analysis of discussion threads within the virtual learning communities. As mentioned previously, further research is needed to assess the quality of online discussions. Although descriptive, these models are time-consuming, and instructors and researchers in the field of educational technology need to continue to explore the use of rubrics for the assessment of critical thinking in online discussions. In this study, the use of smaller virtual learning communities showed equivalent critical thinking capacity across teams. This will enable instructors to design online learning with the assurance that deeper learning and critical thinking can occur without students having to read and reply to whole-class discussions.

    REFERENCES

Edelstein, S., & Edwards, J. (2003). If you build it, they will come: Building learning communities through threaded discussions. Online Journal of Distance Learning Administration, 5(1). Retrieved November 9, 2004, from http://www.westga.edu/~distance/ojdla/spring51/edelstein51.html

Garrison, D. R. (1992). Critical thinking and self-directed learning in adult education: An analysis of responsibility and control issues. Adult Education Quarterly, 42(3), 136-148.

Geertz, C. (1973). Thick description: Toward an interpretive theory of culture. In The interpretation of cultures (pp. 5-30). New York: Basic Books.

Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.), Collaborative learning through computer conferencing (pp. 117-136). Berlin, Germany: Springer-Verlag.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.

Lipman, M. (1991). Thinking in education. Cambridge, MA: Cambridge University Press.

Marra, R. M. (2002). The ideal online learning environment for supporting epistemic development: Putting the puzzle together. Quarterly Review of Distance Education, 3(1), 15-31.

Marra, R. M., Moore, J. L., & Klimczak, A. K. (2004). Content analysis of online discussion forums: A comparative analysis of protocols. ETR&D, 52(2), 23-40.

Mason, R. (1992). Evaluation methodologies for computer conferencing applications. In A. R. Kaye (Ed.), Collaborative learning through computer conferencing (pp. 105-116). Berlin, Germany: Springer-Verlag.

Moore, M. G. (1989). Editorial: Three types of interaction. The American Journal of Distance Education, 3(2), 1-6.

Newman, D. R., Webb, B., & Cochrane, C. (1996). A content analysis method to measure critical thinking in face-to-face and computer supported group learning. Retrieved April 25, 2005, from http://www.qub.ac.uk/agt/papers/methods/contpap.html

Northrup, P. T. (2002). Online learners' preferences for interaction. Quarterly Review of Distance Education, 3(2), 219-226.

Palloff, R. M., & Pratt, K. (1999). Building learning communities in cyberspace: Effective strategies for the online classroom. San Francisco: Jossey-Bass.

Resnick, L., Levine, J., & Teasley, S. (1991). Perspectives on socially shared cognition. Washington, DC: American Psychological Association.

Simonson, M., Smaldino, S., Albright, M., & Zvacek, S. (2003). Teaching and learning at a distance: Foundations of distance education (2nd ed.). Upper Saddle River, NJ: Merrill Prentice Hall.

Tu, C., & Corry, M. (2002). eLearning communities. Quarterly Review of Distance Education, 3(2), 207-218.

Visser, L., Visser, Y. L., & Schlosser, C. (2003). Critical thinking in distance education and traditional education. Quarterly Review of Distance Education, 4(4), 401-407.