
Internet and Higher Education 13 (2010) 242–247


Learning through science writing via online peer assessment in a college biology course

Jyh-Chong Liang a, Chin-Chung Tsai b,⁎
a Graduate Institute of Engineering, National Taiwan University of Science and Technology, #43, Sec. 4, Keelung Rd., Taipei 106, Taiwan
b Graduate Institute of Digital Learning and Education, National Taiwan University of Science and Technology, #43, Sec. 4, Keelung Rd., Taipei 106, Taiwan

⁎ Corresponding author. Tel.: +886 2 27376511; fax: +886 2 27376433.
E-mail addresses: [email protected] (J.-C. Liang), [email protected] (C.-C. Tsai).
URL: http://www.cctsai.net (C.-C. Tsai).

1096-7516/$ – see front matter © 2010 Elsevier Inc. All rights reserved.
doi:10.1016/j.iheduc.2010.04.004

Article info
Article history: Accepted 26 April 2010
Keywords: Online technology; Peer assessment; Science writing; Self-assessment; College biology

Abstract
This study used an online peer assessment activity to help 47 college students to learn biology through writing. Each student submitted a biology writing report to an online system and then experienced three rounds of peer assessment. During the online peer assessment process, self, peer and expert evaluation scores for the writing were gathered across three rounds. It was found that the self-assessment scores were not quite consistent with the expert's scores, but the peer assessment scores demonstrated adequate validity with respect to the expert's evaluation. In particular, when the students had more rounds of peer assessment for reviewing the writing, the validity of the peer scores was enhanced. An examination of the students' writing scores, allocated by peers and the expert, indicated that the students significantly improved their writing as the peer assessment activity proceeded. Content analyses of the students' writing also revealed that their writing gradually developed significantly better coverage, richness and organization as a result of the online peer assessment activity.

© 2010 Elsevier Inc. All rights reserved.

1. Introduction

There has been strong interest among educational researchers in using writing as a means of enhancing students' learning (Beall, 1998; Prain, 2006). Relevant studies focus on the research issues regarding how writing can enhance students' learning, how writing affects students' learning, and the potential learning outcomes derived from writing (Hand, Hohenshell & Prain, 2007). Moreover, some studies indicate that writing with learning protocols or prompts can also foster students' self-regulated learning (Klein, 1999; Nuckles, Hubner & Renkl, 2009).

For undergraduate science or engineering students, science writing skills are very important for their academic studies and their own future work (Finegold, 2002). In their future science-related careers, they might need to complete technical or scientific written reports along with scientific industry projects for themselves or their team (Venables & Summit, 2003). In the last year of undergraduate study, some schools or advanced courses require students to produce written reports containing suitable data collection, careful analysis and in-depth discussion with current references. However, during the school year there is little or no opportunity for the students to practice science writing; thus they may lack the proper skills to successfully complete the writing and achieve adequate writing quality (Venables & Summit, 2003). One solution to this problem is to create more opportunities for college students to write, across different contexts and courses, in ways that enable them to organize their ideas and understandings better, thus enhancing writing quality (Prain, 2006).

Past studies of science writing have sought to explore its benefits or effects on different aspects of students' learning. For example, Klein (1999) analyzed the science writing of preservice education students observing a science experiment, and found that they increased the complexity of their science writing in generating explanations. Keys, Hand, Prain and Collins (1999) used the science writing heuristic as a tool for learning from laboratory activities in secondary science, and found that this tool facilitated students' adequate understanding of the nature of science. Researchers have also investigated the effects of discussion and writing on learning science, with the results showing that discussing with peers combined with writing appears to enhance the retention of science concepts over time (Rivard & Straw, 2000). Hand et al. (2007) examined the effects of multiple writing tasks on students' understanding of cell and molecular biology concepts, and found that multiple and non-conventional kinds of writing do help students to learn biology. Trautmann (2009) analyzed the impact of toxicology experiment writing on undergraduate science students' revision of research reports in a computer-supported collaborative environment, and concluded that receiving reviews from peers was positively related to the revisions of the reports. These studies discussed how to enhance science learning through writing;


however, not many studies have addressed how to better evaluate the quality of science writing directly. As the evaluation of science writing involves various aspects (e.g., its scientific validity, consistency or richness), it is more difficult to assess science writing than traditional tests in science. Similar to the peer review of academic publications by journals, this study proposes that the use of peer assessment may be a potential way to evaluate science writing.

Peer assessment has gradually been implemented in higher education (Matsuno, 2009; Topping, 1998; Tseng & Tsai, 2010). Implementing peer assessment has been found to reduce teachers' load and to improve the quality of students' learning through the feedback of their peers (Bouzidi & Jaillet, 2009; Tsai, Lin & Yuan, 2002). Peer assessment activities have been conducted with different learning subjects, such as science (Carlson & Berry, 2008; Tsai et al., 2002; Tsai & Liang, 2009), writing (Cho, Schunn & Wilson, 2006; van den Berg, Admiraal & Pilot, 2006), and language (Matsuno, 2009). In particular, "writing" is a popular subject for educators undertaking peer assessment (e.g., Gielen, Peeters, Dochy, Onghena & Struyven, 2010; Venables & Summit, 2003; Yang & Tsai, 2010). In peer assessment activities, students are engaged in writing and have more chances to receive and/or provide feedback than when confined to reviewing only their own writing (Trautmann, 2009). In other words, students not only learn how to write their own reports, but also learn from evaluating others' writing, which supplies more ideas about how to modify their own work. Students benefit from playing the role of assessor for their peers, and also gain feedback about their strengths and weaknesses from their peers in such activities (Tsai, 2009; Xiao & Lucking, 2008). Most students perceive improvement in their writing as a result of peer assessment (van den Berg et al., 2006). Previous studies also indicate that peer assessment is as valid as the instructor's judgment, and conclude that validity should not be a barrier to its implementation (Cho et al., 2006; Orsmond, Merry & Reiling, 2000; Topping, 2008).

Another form of evaluation, often combined or considered together with peer assessment, is self-assessment (Papinczak, Young, Groves & Haynes, 2007). Students develop their evaluation skills through self-assessment and are able to compare their self evaluations with those of others. However, some previous studies have indicated that self-assessment scores correlate neither with peers' scores nor with tutors' or teachers' scores (Langan, Shuker, Cullen, Penney, Preziosi & Wheater, 2008; Matsuno, 2009; Papinczak et al., 2007). Nevertheless, many researchers (e.g., Orsmond et al., 2000; Sluijsmans, Dochy & Moerkerke, 1998) have highlighted the importance of both self-assessment and peer assessment for learning enhancement.

Online technology and Internet learning environments provide a new approach to administering peer assessment (Tsai & Liang, 2009; Wen & Tsai, 2006). Several studies of online peer assessment show the benefits of utilizing online technology for both students and teachers (Hou, Chang & Sung, 2007; Xiao & Lucking, 2008). Proper usage of online environments for peer assessment can supply a higher degree of anonymity and provide more freedom of time and location for the students, thus stimulating feedback exchange among peers (Tsai & Liang, 2009; Tsai, Liu, Lin & Yuan, 2001) and fostering students' favorable attitudes toward peer assessment (Wen & Tsai, 2006). Research has also found that students take peer reviews seriously insofar as they provide thorough and constructive comments (Bauer, Figl, Derntl, Beran & Kabicher, 2009). Tseng and Tsai (2007) likewise showed the importance of different types of peer feedback for students' learning enhancement in an online peer assessment task: students receiving more constructively suggestive feedback from peers tended to improve more on the learning task, while feedback in a more directly corrective or didactic mode did not lead to better improvement. However, few studies have utilized online peer assessment to help students learn through science writing.

In sum, the literature suggests that peer assessment is valid and that its implementation is effective in improving students' learning performance, especially via the incorporation of online technology. This study used an online peer assessment activity (with three rounds of peer review) to help a group of undergraduate students to learn biology through writing relevant science reports. The correlations among expert, peer and self-assessment scores were investigated. The possible improvement in the quality of the students' science writing throughout the different rounds of the online peer assessment was also explored.

2. Method

2.1. Participants

The participants in this study were 47 college students (21 male and 26 female) enrolled in a biology course at a technology institute in Taiwan. They were students in an undergraduate program majoring in sports science. One of the course requirements asked each student to write a report on a biological topic (cell, plant, insect, or animal) chosen according to his/her own interest. The chosen topic should explore more biological knowledge than that taught by the teacher in the course.

2.2. Online peer assessment module in the biology course

At the beginning of the biology course, each student discussed the topic chosen for the writing with the teacher, and was then allowed to search for information online before being required to write a preliminary biology-relevant report. For example, the selected topics included the possible reasons why more and more bees have been disappearing recently, the competing theories of the dinosaurs' extinction, coral bleaching in the Taiwan Strait, and the ecological tensions between some indigenous and alien species in Taiwan. The requirements for the written report included a specific topic in biology, background information, scientific findings, results or summary, and references. Each student then submitted his/her report to an online peer assessment system, which assigned it for peer review. Each report was assessed by five randomly chosen peers, and thus each student also assessed five peers' reports. After being assessed by their peers, the students revised their own writing reports according to their peers' comments and suggestions. The peer assessment in this class was conducted in three rounds. That is, the students were assigned to assess their peers' reports three times, and they also needed to revise their own reports twice (with the process including initial submission, first peer assessment, revision submission, second peer assessment, second revision submission, and third peer assessment). Each report was evaluated by the same group of reviewers in each of the different peer assessment rounds. Such an online peer assessment system and implementation procedure have been used in some previous studies (e.g., Chen & Tsai, 2009; Tseng & Tsai, 2007; Wen & Tsai, 2008). The peer assessment process took about two months. The reviews of peers' reports were undertaken in an anonymous (double-blind) way. Each of the participants performed the roles of both author and reviewer.
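The assignment scheme described above can be sketched in code. This is an illustrative reconstruction, not the system actually used in the study: the paper states only that each report was reviewed by five randomly chosen peers, that every student also reviewed five reports, and that reviewer groups stayed fixed across the three rounds. The circular construction below is one simple way to satisfy those constraints.

```python
import random

def assign_reviewers(student_ids, k=5, seed=0):
    """Give each report k peer reviewers, double-blind and with no self-review.

    A balanced construction (an assumption; the paper does not describe its
    algorithm): shuffle the students once, then assign each report the next
    k students in the shuffled circular order. Every student reviews exactly
    k reports and is reviewed by exactly k peers, and the same reviewer
    group can be reused for all three rounds.
    """
    order = list(student_ids)
    random.Random(seed).shuffle(order)  # fixed seed -> stable groups across rounds
    n = len(order)
    return {
        order[i]: [order[(i + j) % n] for j in range(1, k + 1)]
        for i in range(n)
    }

# 47 students, five reviewers per report, as in the study
assignment = assign_reviewers(range(47), k=5)
```

Because the construction is circular, the reviewer load is exactly balanced, which matches the paper's statement that every student both received and gave five reviews per round.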

2.3. Peer, self and expert scores

In each round of the peer assessment, every student's report was scored on the following five dimensions by him/herself, his/her peers and the course teacher.

1. Knowledge: the extent to which the depth of knowledge was discussed.
2. Suitability: the suitability of the chosen topic.
3. Correctness: the accuracy of the biology concepts conveyed.
4. Creativity: the creativity in reporting on the topic and the content.
5. Overall: the overall judgment of the report.


Before conducting the peer assessment, the course teacher and the students had an explicit discussion about the assessment criteria. Some exercises with concrete examples were provided. The students also had some training in using the online peer assessment system for submitting reports and reviews.

Each student enrolled in this class first submitted his/her report to the online peer assessment system and scored him/herself (i.e., self-assessment) from 1 to 7 points on each dimension described above. Then he/she scored five peers' reports from 1 to 7 points on each dimension. In addition to these quantitative evaluations, each student also needed to provide his/her peers with some concrete suggestions or advice for revising their reports in the next round. In other words, each student acquired five peers' scores and comments with which to revise his/her own report. This study used the average scores from the five peers to represent the "peer" scores. When the students submitted their revised reports, they were asked to rate their own performance again on the same dimensions and score scale. The course teacher also scored each student's writing in each round on the same dimensions, and this was viewed as the expert's score. However, it should be noted that the expert's scores were not revealed during the online peer assessment activity, so they could not influence the peers' judgments or self evaluations. In this way, the study gathered peer, self and expert evaluations for each dimension in each round.
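The aggregation step described above is simple: for each dimension, the "peer" score is the mean of the five reviewers' 1–7 ratings. A minimal sketch, using the dimension names from Section 2.3 (the data structure itself is hypothetical; the study's system stored the scores online):

```python
DIMENSIONS = ["Knowledge", "Suitability", "Correctness", "Creativity", "Overall"]

def peer_scores(ratings):
    """Average five reviewers' 1-7 ratings into one peer score per dimension.

    `ratings` is a list of dicts, one per reviewer, mapping dimension -> rating
    (an assumed representation for illustration).
    """
    return {dim: sum(r[dim] for r in ratings) / len(ratings) for dim in DIMENSIONS}

# hypothetical ratings from one report's five reviewers
five_reviews = [
    {"Knowledge": 4, "Suitability": 3, "Correctness": 4, "Creativity": 3, "Overall": 4},
    {"Knowledge": 5, "Suitability": 4, "Correctness": 4, "Creativity": 4, "Overall": 4},
    {"Knowledge": 4, "Suitability": 4, "Correctness": 5, "Creativity": 3, "Overall": 4},
    {"Knowledge": 3, "Suitability": 4, "Correctness": 4, "Creativity": 4, "Overall": 3},
    {"Knowledge": 4, "Suitability": 5, "Correctness": 3, "Creativity": 4, "Overall": 5},
]
scores = peer_scores(five_reviews)  # e.g. scores["Knowledge"] == 4.0
```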

2.4. Content analyses: some additional quantitative indicators

In addition to analyzing the peer, self and expert scores during the process, the content of the reports was further analyzed across the three rounds of the peer assessment activity. The content analysis counted the total number of words, pictures (or figures), tables, (explanatory) notes for pictures, references and subheadings. These indicators were determined by two experts, and were viewed as important indicators that were easy to process for quantitative analysis. They were considered suitable to moderately represent the coverage of the science writing (e.g., the total number of words), the richness and variation of the information included (e.g., the numbers of pictures, tables and references), and the clarity of the writing (e.g., the numbers of notes for pictures and subheadings). Higher values for these indicators indicate better extent, richness, and structure of the writing.
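Counts like these are straightforward to compute mechanically once a report is in plain text. The sketch below is purely illustrative: the markers `[pic]` and `[table]` and the bracketed reference style are hypothetical placeholders, since the study's reports were coded by two human experts rather than by software.

```python
import re

def content_indicators(report_text):
    """Tally the quantitative indicators used in the content analysis.

    The markup conventions here ([pic], [table], lines starting with "[n]")
    are assumptions for illustration; real reports would need a parser
    matched to their actual format.
    """
    return {
        "words": len(report_text.split()),
        "pictures": len(re.findall(r"\[pic\]", report_text)),
        "tables": len(re.findall(r"\[table\]", report_text)),
        "references": len(re.findall(r"^\[\d+\]", report_text, re.MULTILINE)),
    }

sample = "Bees are declining worldwide.\n[pic]\n[pic]\n[table]\n[1] Smith, 2008."
counts = content_indicators(sample)
```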

3. Results

3.1. The correlation between self and expert scores

Table 1 shows the correlations between the self and expert scores for the five dimensions in each assessment round. As shown in Table 1, among the fifteen outcome variables across the three rounds, there were only three significant positive correlations between the self and expert scores: the 'Knowledge', 'Suitability' and 'Overall' dimensions in the first round. That is, the expert and students' self scores were statistically consistent only for these three dimensions in the first round, and were not statistically consistent in the later (second and third) rounds. In other words, there was only limited agreement between the expert's and students' self-assessments in the first round,

Table 1
The correlation between expert and self-assessment scores for each outcome variable.

               Knowledge   Suitability   Correctness   Creativity   Overall
First round      0.34⁎       0.34⁎         0.26          0.22        0.38⁎⁎
Second round    −0.02       −0.17         −0.13          0.02       −0.03
Third round      0.22       −0.05          0.26          0.10        0.20

⁎ p < 0.05. ⁎⁎ p < 0.01.

and no agreement in the later rounds. The validity of the self-assessment was therefore not high in light of these research findings.

3.2. The correlation between peer and expert scores

Table 2 shows the correlations between the peer and expert scores for the five dimensions in each assessment round. According to Table 2, in the first round the 'Suitability', 'Creativity' and 'Overall' dimensions show significant correlations, indicating that the expert and peer scores were statistically consistent for these three dimensions. In the second and third rounds, however, the expert and peer scores were all statistically consistent, revealing greater agreement between the expert's and students' assessments. It was also found that the correlation coefficients between the peer and expert scores increased throughout the peer assessment rounds. In the third round, the coefficients ranged from 0.40 to 0.66, indicating quite good consistency between the peer and expert scores. In other words, the tendency is that as students gain more experience of peer assessment, they learn more about evaluating their peers' reports, and in turn their scores become more in harmony with those of their teacher. In general, the peer assessment scores demonstrated good validity with respect to the expert's evaluation.
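The validity checks in Tables 1 and 2 compare two score series per dimension and round. The paper does not name its correlation statistic; assuming the conventional Pearson product–moment coefficient, a self-contained sketch is:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists.

    Used here as an assumed stand-in for the paper's validity index:
    r close to 1 means two raters (e.g. peers vs. the expert) ranked
    the reports similarly; r near 0 means little agreement.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical peer vs. expert scores for a handful of reports
peer = [4.2, 3.8, 4.6, 3.1, 4.9]
expert = [3.9, 3.5, 4.4, 3.0, 4.7]
r = pearson_r(peer, expert)  # near 1: the two raters largely agree
```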

3.3. The effects of online peer assessment on the development of students' biology writing

To explore the possible progress in the students' science writing, this study used the students' scores allocated by their peers and the expert across the three rounds to examine the changes throughout the online peer assessment activity. As the self-assessment was not found to show sufficient validity, its scores were not included in this part of the analyses. The descriptive data for the students' scores from peers on the five dimensions in each round are shown in Table 3. Table 3 shows that the students' average scores in the first round, as evaluated by their peers, were 3.97, 3.69, 3.86, 3.50 and 3.81 for the five dimensions (Knowledge, Suitability, Correctness, Creativity and Overall), respectively. The peer scores for the second round of the assessment were 4.07, 3.92, 4.14, 3.81 and 4.04 for the five dimensions, while the scores for the third round were 4.67, 4.60, 4.73, 4.45 and 4.75. Table 3 also shows that the students gained increasing average scores on each dimension from round to round. Thus, a series of paired t-tests was used to further compare the score changes. The students' scores on each dimension were all statistically higher in later rounds than in former rounds, except for the dimension of 'Knowledge' between the first and second rounds. In almost all situations, the students significantly improved their reports according to the peer evaluation scores.
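The round-to-round comparisons above are paired tests, since the same 47 students are measured in every round. A minimal sketch of the paired-samples t statistic (the scores below are hypothetical, not the study's data):

```python
import math

def paired_t(x, y):
    """Paired-samples t statistic for two rounds of scores from the same
    students (df = n - 1). With x as the earlier round, a negative t means
    scores rose, matching the sign convention in Tables 3-5.
    """
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((v - mean_d) ** 2 for v in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# hypothetical first- and second-round scores for three students
t = paired_t([3.0, 4.0, 3.5], [4.0, 5.0, 5.5])
```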

Table 4 shows the scores marked by the expert (the course instructor). It shows that the students' average scores in the first round, as evaluated by the expert, were 2.65, 2.72, 3.12, 2.56 and 2.87 for the five dimensions, respectively. The expert scores for the second round were 3.12, 3.34, 3.49, 3.12 and 3.31 for the five dimensions, whereas those for the third round were 3.62, 3.96, 3.88, 3.67 and 3.85. Table 4 also reveals a similar increasing trend for the different dimensions in the different assessment rounds. The paired t-tests show that the students' scores for each dimension were all statistically

Table 2
The correlation between expert and peer scores for each outcome variable.

               Knowledge   Suitability   Correctness   Creativity   Overall
First round      0.15        0.33⁎        −0.06         0.53⁎⁎⁎      0.36⁎
Second round     0.40⁎⁎      0.33⁎         0.48⁎⁎       0.44⁎⁎       0.45⁎⁎
Third round      0.40⁎⁎      0.57⁎⁎⁎       0.42⁎⁎       0.66⁎⁎⁎      0.58⁎⁎⁎

⁎ p < 0.05. ⁎⁎ p < 0.01. ⁎⁎⁎ p < 0.001.


Table 3
The scores of students' biology writing from peers' perspectives and their progression (n = 47).

              (1) First round   (2) Second round   (3) Third round   Paired t-test a
              (mean, S.D.)      (mean, S.D.)       (mean, S.D.)
Knowledge     3.97 (0.73)       4.07 (0.64)        4.67 (0.57)       (3) > (2) (t = −9.08⁎⁎⁎)
Suitability   3.69 (0.73)       3.92 (0.50)        4.60 (0.53)       (3) > (2) (t = −11.33⁎⁎⁎); (2) > (1) (t = −2.15⁎)
Correctness   3.86 (0.71)       4.14 (0.61)        4.73 (0.61)       (3) > (2) (t = −7.66⁎⁎⁎); (2) > (1) (t = −2.56⁎)
Creativity    3.50 (0.80)       3.81 (0.58)        4.45 (0.65)       (3) > (2) (t = −7.66⁎⁎⁎); (2) > (1) (t = −2.95⁎⁎)
Overall       3.81 (0.80)       4.04 (0.64)        4.75 (0.67)       (3) > (2) (t = −7.90⁎⁎⁎); (2) > (1) (t = −2.26⁎)

a This table only lists the tests with a significant difference.
⁎ p < 0.05. ⁎⁎ p < 0.01. ⁎⁎⁎ p < 0.001.


higher in later rounds than in former rounds, suggesting that the students significantly improved their science writing in each dimension from the expert's point of view.

3.4. The content analyses of students' biology writing

The content analyses of the students' biology writing counted the total number of words, pictures, tables, notes for pictures, references and subheadings in their reports. The results are presented in Table 5. Table 5 shows that the average number of words was 2102.00, 2355.68 and 2564.70 across the three rounds; the number of pictures was 4.04, 5.98 and 8.06; the number of tables was 0.19, 0.23 and 0.21; the number of notes for pictures was 2.02, 3.09 and 5.09; the number of references was 1.66, 2.15 and 2.34; and the number of subheadings was 6.13, 6.64 and 7.02 in the three rounds, respectively. By and large, the numbers of references and tables were relatively low across the three rounds. A statistical examination of the means of the students' content indicators for each round revealed a significantly increasing average number of words, pictures and notes for pictures across all three rounds. Regarding the numbers of references and subheadings, there was a significant increase from the first to the second round, suggesting that these two indicators could be greatly enhanced only in the early round of peer assessment. However, there was no significant difference in the number of tables in the reports. Although more improvement might be expected for some indicators (e.g., the numbers of tables and references), the content analysis results provide supplementary evidence that the students, in general, significantly enriched their writing through the online peer assessment

Table 4
The scores of students' biology writing from the teacher's perspective and their progression (n = 47).

              (1) First round   (2) Second round   (3) Third round   Paired t-test
              (mean, S.D.)      (mean, S.D.)       (mean, S.D.)
Knowledge     2.65 (0.61)       3.12 (0.67)        3.62 (0.89)       (3) > (2) (t = −5.58⁎⁎⁎); (2) > (1) (t = −6.58⁎⁎⁎)
Suitability   2.72 (0.79)       3.34 (0.86)        3.96 (1.04)       (3) > (2) (t = −7.64⁎⁎⁎); (2) > (1) (t = −6.10⁎⁎⁎)
Correctness   3.12 (0.80)       3.49 (0.75)        3.88 (0.80)       (3) > (2) (t = −5.66⁎⁎⁎); (2) > (1) (t = −2.84⁎⁎)
Creativity    2.56 (0.89)       3.12 (0.93)        3.67 (1.09)       (3) > (2) (t = −7.10⁎⁎⁎); (2) > (1) (t = −4.99⁎⁎⁎)
Overall       2.87 (0.70)       3.31 (0.79)        3.85 (0.96)       (3) > (2) (t = −7.31⁎⁎⁎); (2) > (1) (t = −4.50⁎⁎⁎)

⁎⁎ p < 0.01. ⁎⁎⁎ p < 0.001.

process. Their writing gradually developed with better coverage, richness and organization.

4. Discussion and conclusion

The purpose of this study was to examine students' learning of biology through writing via online peer assessment. First, the correlations between the self and expert scores were examined. As shown in Table 1, there were few significant positive correlations between the scores. Self-assessment refers to learners making reflective judgments of their own learning, which can help them contemplate the status of their learning (Sluijsmans et al., 1998). Whether students evaluate themselves correctly or not is a controversial issue (Matsuno, 2009; Sluijsmans et al., 1998). However, in this study there was little agreement between the self and expert scores, especially in the later rounds of the assessment. In light of this finding, the self-assessment could not be viewed as a valid assessment of the students' work. This result is also congruent with those of previous studies (e.g., Langan et al., 2008; Matsuno, 2009; Papinczak et al., 2007). Several possibilities may explain the low validity of the self-assessment. The first explanation is related to individual students' idiosyncratic nature and limited experience of evaluating themselves as part of formal assessment (Matsuno, 2009; Papinczak et al., 2007). Matsuno (2009) has also pointed out that high achieving writers can be overly critical of themselves, which may be caused by a sense of modesty. In particular, some aspects of the cultural and social environments in Taiwan may expect high achievers to be more humble about their expected performance; thus, these students might underestimate their performance. Another possibility is the limited time or number of rounds of the online peer assessment; perhaps the students could develop more adequate self-assessment ability with more peer feedback over time (Sluijsmans et al., 1998).

The relationship between the peer and expert scores shown in Table 2 indicates that they were statistically consistent, with the exception of 'Knowledge' and 'Correctness' in the first round. In general, the peer assessment in this study demonstrated adequate validity. This finding is in harmony with the conclusions of previous studies (e.g., Tsai & Liang, 2009; Wen & Tsai, 2008). However, at the beginning of the peer assessment activity the students did not seem to be highly capable of judging their peers' science writing, especially with respect to the depth of the biology knowledge and the accuracy of the biology concepts. Perhaps these two dimensions require more solid scientific knowledge for making evaluations. Moreover, the correlation coefficients between the peer and expert scores in Table 2 display an increasing trend throughout the online peer assessment rounds. Tsai and Liang's (2009) study found that with more experience or more rounds of peer assessment, the validity of the peer scores was enhanced. Papinczak et al. (2007) also claimed that peer scores correlate moderately with tutor ratings initially and improve over time. As students acquire more experience of peer assessment, they can learn more about evaluating their peers' performance; therefore their evaluations become more in accordance with those of their teacher, a finding shown in this study.

To understand the improvement in the students' performance due to the online peer assessment system, this study used the peer and expert scores across the three rounds to evaluate the score changes throughout the activity. As the students' self-assessment did not display good validity, the progression of the self scores was not included. As presented in Tables 3 and 4, the students' scores on each dimension were, in general, statistically higher in later rounds from both the expert's and the peers' points of view. In other words, through the online peer assessment, these students gained progressively higher scores across the three rounds; they significantly improved their science writing in terms of both the expert's and the peers' evaluations. These positive effects of peer assessment on the students' work are consistent with previous studies showing the favorable impact of peer assessment on students' learning performance (Barak & Rafaeli, 2004; Cho & Schunn, 2007; Tsai & Liang, 2009), particularly in writing (Cho et al., 2006; van den Berg et al., 2006; Venables & Summit, 2003; Yang & Tsai, 2010). This study, however, is unique in its focus on science writing. The students in this study experienced both biology writing and peer assessment at the same time; the entire process could be satisfactorily completed, and the students enhanced their science writing through the comments from the peer evaluations. This implies that the students' improvement in their biology writing resulted from the online peer assessment and feedback.

Table 5
Content analysis of students' biology writing (n = 47).

                              (1) First round     (2) Second round    (3) Third round     Paired t-test a
                              mean (S.D.)         mean (S.D.)         mean (S.D.)
Number of words               2102.00 (1408.50)   2355.68 (1459.42)   2564.70 (1481.11)   (3)>(2) (t = −3.29⁎⁎); (2)>(1) (t = −2.17⁎)
Number of pictures            4.04 (3.05)         5.98 (4.33)         8.06 (7.62)         (3)>(2) (t = −2.69⁎); (2)>(1) (t = −3.76⁎⁎⁎)
Number of tables              0.19 (0.57)         0.23 (0.60)         0.21 (0.59)
Number of notes for pictures  2.02 (2.45)         3.09 (4.06)         5.09 (7.29)         (3)>(2) (t = −2.70⁎); (2)>(1) (t = −2.22⁎)
Number of references          1.66 (2.64)         2.15 (2.48)         2.34 (2.52)         (2)>(1) (t = −2.90⁎⁎)
Number of subheadings         6.13 (4.24)         6.64 (3.90)         7.02 (4.00)         (2)>(1) (t = −2.13⁎)

a This table only lists the tests with significant differences.
⁎ p < 0.05. ⁎⁎ p < 0.01. ⁎⁎⁎ p < 0.001.

246 J.-C. Liang, C.-C. Tsai / Internet and Higher Education 13 (2010) 242–247

To acquire a greater understanding of the improvements in the students' biology writing, this study further analyzed the writing content using several quantitative indicators: the total number of words, pictures, tables, notes for pictures, references and subheadings in the reports. The content analyses of the students' writing, as shown in Table 5, found that the students had a significantly increasing average number of words, pictures and notes for pictures across all three rounds. They also showed a significant increase in their use of references and subheadings from the first to the second round. These content analyses provide additional evidence that the students significantly enriched their biology writing through the online peer assessment process. Previous studies have often assessed the performance of students' learning through writing by final course grade or examination mark (Freestone, 2009; Hall, Perry, Goetz, Ruthig, Stupnisky & Newall, 2007). Recently, some researchers have discussed learning through writing (Cho et al., 2006; Xiao & Lucking, 2008) or science writing (Chuck & Young, 2004; Trautmann, 2009) by utilizing peer assessment, and have documented gradually increasing scores in students' writing. This study surmises that the improvement in writing through online peer assessment may come from the following sources. When students engage in peer review, the practice of peer assessment may help them identify their own writing weaknesses. When reviewing peers' work, the students also have more opportunities to carefully read some superior writing by their peers; these writings can serve as exemplars for improvement. In addition, the comments from peers can greatly increase writing quality. However, successfully adapting peer comments or making adequate self-reflections throughout the online peer assessment process requires metacognitive mental acts (Tsai, 2009). Learners with better metacognitive ability may therefore benefit more from the online peer assessment process (Tsai, 2009; Yang & Tsai, 2010).
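The round-to-round comparisons of content indicators reported in Table 5 rest on paired-samples t-tests. The sketch below shows how such a test is computed from per-student counts; the word counts are hypothetical placeholders, not the study's data, and the negative t value simply reflects the (round 1 minus round 2) difference convention when the later round has the larger mean, as in Table 5.

```python
# Illustrative sketch of a paired-samples t-test of the kind reported in
# Table 5 (e.g., word counts in round 2 vs. round 1). The counts below
# are hypothetical placeholders, not data from the study.
from statistics import mean, stdev
from math import sqrt

def paired_t(before, after):
    """Paired-samples t statistic (df = n - 1) for matched observations.

    Differences are taken as before - after, so t is negative when the
    'after' round has the larger mean, matching the sign convention of
    the t values reported in Table 5."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

round1_words = [1800, 2400, 1500, 2100, 2600]  # hypothetical per-student counts
round2_words = [2100, 2500, 1900, 2300, 2700]

t = paired_t(round1_words, round2_words)
print(round(t, 2))  # → -3.77
```

The resulting t statistic would then be compared against the critical value for n − 1 degrees of freedom to decide significance at the p < 0.05, 0.01 or 0.001 levels used in Table 5.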

Through qualitative analyses, Klein (1999) pointed out that students increase the complexity of their explanations when learning through science writing. However, there has not been much quantitative research addressing the improvement in students' science writing. This study quantitatively scored and analyzed the content of students' writing via an online peer assessment activity and directly presents evidence showing the refinement and enrichment of the students' science writing. This study can be viewed as one of the initial attempts to utilize online peer assessment to facilitate students' learning through science writing. If possible, it is suggested that one more science writing project be implemented to examine the students' real benefit from the online peer assessment science writing activity (that is, to evaluate whether the first draft of the second science writing shows any improvement over the first draft of the first science writing).

Future research can conduct more in-depth analyses of how online peer assessment for science writing can help students construct or reconstruct scientific knowledge. The role of peer feedback and comments needs further investigation. How each student evaluates and adapts peer comments to improve science writing is also of great research interest. This study was limited to a group of college science students; similar learning activities can be implemented with students of different ages and cultural backgrounds.

Acknowledgement

Funding for this research work was provided by the National Science Council, Taiwan, under grants NSC 96-2511-S-011-002-MY3 and NSC 98-2628-S-243-001-MY3. The authors would like to thank the teacher, students and researchers involved in this study.

References

Barak, M., & Rafaeli, S. (2004). On-line question-posing and peer-assessment as means for web-based knowledge sharing in learning. International Journal of Human-Computer Studies, 61, 84−103.

Bauer, C., Figl, K., Derntl, M., Beran, P. P., & Kabicher, S. (2009). The student views on online peer reviews. Paper presented at the Annual Conference on Innovation and Technology in Computer Science Education, Paris, France.

Beall, H. (1998). Expanding the scope of writing in chemical education. Journal of Science Education and Technology, 7, 259−270.

Bouzidi, L., & Jaillet, A. (2009). Can online peer assessment be trusted? Educational Technology & Society, 12(4), 257−268.

Carlson, P. A., & Berry, F. C. (2008). Using computer-mediated peer review in an engineering design course. Transactions on Professional Communication, 51, 264−279.

Chen, Y. C., & Tsai, C. C. (2009). An educational research course facilitated by online peer assessment. Innovations in Education and Teaching International, 46, 105−117.

Cho, K., & Schunn, C. D. (2007). Scaffolded writing and reviewing in the discipline: A web-based reciprocal peer review system. Computers & Education, 48, 409−426.

Cho, K., Schunn, C. D., & Wilson, R. W. (2006). Validity and reliability of scaffolded peer assessment of writing from instructor and student perspectives. Journal of Educational Psychology, 98, 891−901.

Chuck, J., & Young, L. (2004). A cohort-driven assessment task for scientific report writing. Journal of Science Education and Technology, 13, 367−376.

Finegold, L. (2002). Writing for science as scholarly communication. Journal of Science Education and Technology, 11, 255−260.

Freestone, N. (2009). Drafting and acting on feedback supports student learning when writing essay assignments. Advances in Physiology Education, 33, 98−102.

Gielen, S., Peeters, E., Dochy, F., Onghena, P., & Struyven, K. (2010). Improving the effectiveness of peer feedback for learning. Learning and Instruction, 20, 304−315.

Hall, N. C., Perry, R. P., Goetz, T., Ruthig, J. C., Stupnisky, R. H., & Newall, N. C. (2007). Attributional retraining and elaborative learning: Improving academic development through writing-based interventions. Learning and Individual Differences, 17, 280−290.

Hand, B., Hohenshell, L., & Prain, V. (2007). Examining the effect of multiple writing tasks on year 10 biology students' understandings of cell and molecular biology concepts. Instructional Science, 35, 343−373.

Hou, H. T., Chang, K. E., & Sung, Y. T. (2007). An analysis of peer assessment online discussions within a course that uses project-based learning. Interactive Learning Environments, 15, 237−251.

Keys, C. W., Hand, B., Prain, V., & Collins, S. (1999). Using the science writing heuristic as a tool for learning from laboratory investigation in secondary science. Journal of Research in Science Teaching, 36, 1065−1084.

Klein, P. D. (1999). Learning science through writing: The role of rhetorical structures. Alberta Journal of Educational Research, 45, 132−153.

Langan, A. M., Shuker, D. M., Cullen, W. R., Penney, D., Preziosi, R. F., & Wheater, C. P. (2008). Relationship between student characteristics and self-, peer and tutor evaluations of oral presentations. Assessment & Evaluation in Higher Education, 33, 179−190.

Matsuno, S. (2009). Self-, peer-, and teacher-assessments in Japanese university EFL writing classrooms. Language Testing, 26, 75−100.

Nuckles, M., Hubner, S., & Renkl, A. (2009). Enhancing self-regulated learning by writing learning protocols. Learning and Instruction, 19, 259−271.

Orsmond, P., Merry, S., & Reiling, K. (2000). The use of student derived marking criteria in peer and self-assessment. Assessment & Evaluation in Higher Education, 25(1), 23−38.

Papinczak, T., Young, L., Grove, M., & Haynes, M. (2007). An analysis of peer, self, and tutor assessment in problem-based learning tutorials. Medical Teacher, 29, e122−e132.

Prain, V. (2006). Learning from writing in secondary science: Some theoretical and practical implications. International Journal of Science Education, 28, 179−201.

Rivard, L. P., & Straw, S. B. (2000). The effect of talk and writing on learning science: An exploratory study. Science Education, 84, 566−593.

Sluijsmans, D., Dochy, F., & Moerkerke, G. (1998). Creating a learning environment by using self-, peer- and co-assessment. Learning Environments Research, 1, 293−319.

Topping, K. J. (1998). Peer assessment between students in colleges and universities. Review of Educational Research, 68, 249−276.

Topping, K. J. (2008). Peer assessment. Theory into Practice, 48, 20−27.

Trautmann, N. M. (2009). Interactive learning through web-mediated peer review of student science reports. Educational Technology Research and Development, 57, 685−704.

Tsai, C.-C. (2009). Internet-based peer assessment in high school settings. In L. T. W. Hin & R. Subramaniam (Eds.), Handbook of research on new media literacy at the K-12 level: Issues and challenges (pp. 743−754). Hershey, PA: Information Science Reference.

Tsai, C.-C., & Liang, J. C. (2009). The development of science activities via on-line peer assessment: The role of scientific epistemological views. Instructional Science, 37, 293−310.

Tsai, C.-C., Liu, E. Z. F., Lin, S. S. J., & Yuan, S. M. (2001). A networked peer assessment system based on a Vee heuristic. Innovations in Education and Teaching International, 38, 220−230.

Tsai, C.-C., Lin, S. S. J., & Yuan, S. M. (2002). Developing science activities through a networked peer assessment system. Computers & Education, 38, 241−252.

Tseng, S. C., & Tsai, C.-C. (2007). On-line peer assessment and the role of the peer feedback: A study of high school computer course. Computers & Education, 49, 1161−1174.

Tseng, S. C., & Tsai, C.-C. (2010). Taiwan college students' self-efficacy and motivation of learning in online peer assessment environments. The Internet and Higher Education, 13, 164−169.

van den Berg, I., Admiraal, W., & Pilot, A. (2006). Design principles and outcomes of peer assessment in higher education. Studies in Higher Education, 31, 341−356.

Venables, A., & Summit, R. (2003). Enhancing scientific essay writing using peer assessment. Innovations in Education and Teaching International, 40, 281−290.

Wen, L. M. C., & Tsai, C. C. (2006). University students' perceptions of and attitudes toward (online) peer assessment. Higher Education, 51, 27−44.

Wen, L. M. C., & Tsai, C. C. (2008). Online peer assessment in an inservice science and mathematics teacher education course. Teaching in Higher Education, 13, 55−67.

Xiao, Y., & Lucking, R. (2008). The impact of two types of peer assessment on students' performance and satisfaction within a Wiki environment. Internet and Higher Education, 11, 186−193.

Yang, Y. F., & Tsai, C. C. (2010). Conceptions of and approaches to learning through online peer assessment. Learning and Instruction, 20, 72−83.