
Fig. 1. The dialectic action research spiral.

Developing Assessment Criteria for Portfolio Assessed Introductory Programming

Andrew Cain

Faculty of Information and Communication Technologies, Swinburne University of Technology, Hawthorn, Victoria, Australia

Email: [email protected] 

Abstract—Constructive alignment aims to improve learning outcomes by focusing on what the student does. This work examines the development of assessment criteria for an introductory programming unit that used portfolio assessment to implement constructive alignment. After initial setbacks, effective assessment criteria were identified which enabled quick, accurate assessment of student portfolios. Pass rates improved over the period of the research, and portfolios for higher grades demonstrated students' ability to apply the concepts learnt and to carry out small research projects. The current state of the assessment criteria is presented, and the work continues through reflective teaching practice.

Keywords—introductory programming; constructive alignment; portfolio assessment

I. INTRODUCTION

Biggs' early work on constructive alignment [1,2] presented portfolio assessment as a means of encouraging students to use deep approaches to learning. Subsequently, little research has reported similar approaches. Our previous work presented a model of constructive alignment with portfolio assessment for teaching introductory programming [3], and examined issues students raised in units delivered using this approach [4].

One of the main challenges discussed in previous work [3] was the development of suitable assessment criteria. This work is part of an ongoing initiative that aims to improve constructively aligned portfolio assessed units (see also [5,6]). The work presented in this paper outlines nine iterations of an action research project, reporting the criteria used, student results, staff reflections, and analysis for each iteration.

It is hoped that these reflections will provide other academics with the tools, and motivation, to trial and report on portfolio assessment in other contexts.

The Action Research Method section of this paper outlines the research method used, followed by a description of the units in which this research took place. The Iterations section then presents relevant details from the nine iterations, and details of the focus and plans for the tenth iteration. The paper then concludes with Discussion and Conclusions.

II. ACTION RESEARCH METHOD

Due to the practical nature of this research and its focus on student learning, it was decided to follow a Practical Action Research [7] design based on Mills' [8] dialectic action research spiral. This model, shown in Fig. 1, includes a four-step process: (1) identify an area of focus, (2) collect data, (3) analyze and interpret the data, and (4) develop an action plan. The Iterations section documents the focus, plan, data, and analysis and reflections per iteration.

In this action research project, iterations align with semesters. In each semester, one or more introductory programming units were delivered using the portfolio assessment approach from [3]. Nine iterations were completed over five years, involving thirteen unit deliveries, with a total of 983 portfolios assessed.

The focus for this research was the development of assessment criteria to support grading of student portfolios in introductory programming units. The iterative nature of the action research process meant that each semester focused on addressing aspects of the assessment criteria, and thereby helped address the overall goal of the research.

Data collection included student grades and analysis of unit documentation and staff reflections. As student portfolios were assessed in order to generate student grades, they were not directly analyzed as part of the data collection.

Unit documentation included the Unit Outline and Unit Review documents. The Unit Outline included intended learning outcomes and assessment criteria, and was provided to students prior to the commencement of each semester. The Unit Review document was created after results were reported, and captured details of student perceptions of the teaching, the teaching and learning approach, results, unit management, and any planned changes for future semesters.

Staff reflections indicated the qualities exhibited in the student portfolios for a given semester. Staff reflections were captured both during the semester and after the portfolios were assessed. These reflections were recorded in notes, and in many cases were reported in the Unit Review document.

Fig. 2. Progression pathways through the programming units.

Student grades provide an indication of how well students performed in the given semester. Together with the staff reflections, these results provide insight into the learning outcomes students achieved – insights not available by considering students' grades alone.

III.  THE UNITS

This research examines the development and use of assessment criteria for a number of units. Table I shows the four different programming units that were part of this work, and the iterations in which they were involved. All of the units were taken by undergraduate students early in their degree program and were convened by the author. General details of the four units follow, and any changes to individual iterations are presented in the following sections.

 A.  Introductory Programming (A)

Introductory Programming (A) was taken by students in their first semester and introduced them to procedural programming. The Intended Learning Outcomes (ILOs) included the ability to read and interpret code, write small procedural programs, iteratively use modular and functional decomposition, and the ability to apply the principles of structured programming (focusing on blocks of code and using sequence, selection, and repetition). All of the ILOs were expressed in a language-neutral manner, as the focus of the unit was on the underlying programming concepts.
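The unit itself was language neutral, but as a concrete illustration (not drawn from the paper) the following C++ sketch shows the kind of small program in which sequence, selection, and repetition all appear:

    #include <iostream>

    // Count how many of the first n positive integers are even.
    // Sequence: statements run in order; selection: the if test;
    // repetition: the for loop.
    int countEvens(int n)
    {
        int evens = 0;                    // sequence
        for (int i = 1; i <= n; i++)      // repetition
        {
            if (i % 2 == 0)               // selection
            {
                evens++;
            }
        }
        return evens;
    }

    int main()
    {
        std::cout << countEvens(10) << std::endl;   // prints 5
        return 0;
    }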

Introductory Programming (A) was taken by students studying a range of degrees, with most being enrolled in a Bachelor of Science, majoring in Computer Science, Professional Software Development, or Games Development.

B. Object Oriented Programming (A)

Object Oriented Programming (A) had Introductory Programming (A) as a prerequisite and was taken by students in their second semester. The ILOs in this unit required students to design, develop and test object oriented programs, as well as communicate the underlying principles of abstraction, encapsulation, inheritance and polymorphism. As with Introductory Programming (A), the ILOs were expressed in a language-neutral manner and the focus was on underlying concepts.
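Again purely as an illustration (the paper contains no code), a minimal C++ sketch of the four principles named in these ILOs could look like this:

    #include <iostream>
    #include <memory>
    #include <vector>

    // Abstraction: Shape describes what any shape can do, not how.
    class Shape
    {
    public:
        virtual double area() const = 0;   // overridden polymorphically
        virtual ~Shape() = default;
    };

    // Inheritance: Rectangle is-a Shape.
    // Encapsulation: width and height are hidden behind the public interface.
    class Rectangle : public Shape
    {
    public:
        Rectangle(double w, double h) : width(w), height(h) {}
        double area() const override { return width * height; }
    private:
        double width, height;
    };

    int main()
    {
        std::vector<std::unique_ptr<Shape>> shapes;
        shapes.push_back(std::make_unique<Rectangle>(3.0, 4.0));
        for (const auto &s : shapes)
        {
            std::cout << s->area() << std::endl;   // polymorphism: prints 12 via Rectangle::area
        }
        return 0;
    }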

The student cohort in Object Oriented Programming (A) consisted only of students that had completed Introductory Programming (A).

C. Introductory Programming (B)

Prior to iteration seven, this unit was taught using a textbook-style approach with assignments and a final exam. In iteration seven the unit was adapted to a portfolio-based approach.

The ILOs for Introductory Programming (B) covered similar topics to Introductory Programming (A) but with specific reference to the C programming language. The ILOs from Introductory Programming (A) were used after the shift to the portfolio approach, and the focus shifted to programming concepts and principles. The unit continued to use C as its programming language for all tasks.

The cohort of Introductory Programming (B) included students from a range of degree programs. This included students studying for a Bachelor of Information and Communication Technology, Bachelor of Engineering, and Bachelor of Science (Computer Science and Software Engineering). The unit was included in a number of other degrees as an elective.

 D.  Object Oriented Programming (B)

As with Introductory Programming (B), Object Oriented Programming (B) was taught using a specific language (C++), with a textbook-style approach, traditional assignments, and a final exam. This unit covered the same topics as Object Oriented Programming (A), and in iteration nine the two object oriented programming units were combined into a single unit. This combined unit used portfolio assessment and its ILOs focused on programming concepts and principles. Students continued to enroll in the individual units, but were taught as a single cohort. Students enrolled in Object Oriented Programming (B) were required to include evidence of being able to apply the unit's concepts using the C++ language.

 E.  Relationship Between Units

Fig. 2 shows progression paths through these units. The students broadly classified as having a software development focus took Introductory Programming (A) in the first semester of their first year, and then Object Oriented Programming (A) in the second semester of their first year. Introductory Programming (B) was taken primarily by Engineering students, who subsequently took an intermediate programming unit before studying Object Oriented Programming (B). For the Engineering students, this sequence may be extended over more than three consecutive semesters depending on their degree program.

IV.  THE ITERATIONS 

 A.  Iteration 1

1) Focus: This was a first attempt at portfolio assessment, and so the focus for the first iteration was on implementing portfolio assessment in general.


2) Action Plan: The approach used was an iterative step toward the general model presented in our previous work [3]. Students were required to complete six assignments and six tests, with the option to include a portfolio. The aim of the assessment strategy was for students to demonstrate their understanding of core concepts in the assignments and tests; the portfolio was used to determine student ability in the higher grade brackets. Appropriate weights were applied to each of the assessment items and these were added together to calculate the final grade.
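The individual weights are not reported in the paper; as a hedged sketch only, combining the marked items into a final grade amounts to a weighted sum along these lines:

    #include <cstddef>
    #include <vector>

    // Hypothetical weighted aggregation of assessment items.
    // Marks and weights are placeholders, not values from the paper;
    // weights are assumed to sum to 1 so the result is a percentage.
    double finalMark(const std::vector<double> &marks,
                     const std::vector<double> &weights)
    {
        double total = 0.0;
        for (std::size_t i = 0; i < marks.size() && i < weights.size(); i++)
        {
            total += marks[i] * weights[i];   // weight each assignment/test result
        }
        return total;
    }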

Key differences from the portfolio model presented in [3]:

• The unit included a total of eleven ILOs, each making use of active verbs and relating to specific parts of the unit.

•  It used multiple assignments (six) over the semester.

• It included tests that were marked and contributed to the final grade.

• Submission of a portfolio was optional. All students who submitted a portfolio were interviewed.

Similarities with later portfolio iterations:

• It included criteria for each grade, though these were described in general terms.

• Students included a self-assessment against the criteria.

3) Data: Unit results across all iterations are shown in Table II. This lists the number of students receiving each grade over the nine iterations. Grades indicate students who enrolled but did not submit a portfolio (NA), those who failed (N), and those who received Pass (P), Credit (C), Distinction (D), and High Distinction (HD) grades. The results for Iteration 1 are shown in Fig. 3. In this iteration a large percentage of students managed to receive an HD grade.

4) Reflections and Analysis: Staff reflections included several interesting aspects related to the assessment:

• There were too many ILOs, and students found it difficult to relate their portfolio pieces back to these.

• Criteria were difficult to apply, being weakly defined. Students weakened the criteria further in their self-assessment.

•  Assessing the portfolios was very time consuming.

•  Core assignments and tests:

-  Covered the minimum expectations for the ILOs.

-  Received high marks, which indicated coverage of ILOs.

• Students with weak portfolios were able to receive high grades.

The use of portfolios was limited in this iteration, with positive and negative results. The two main issues were the weakness of the expressed assessment criteria and the combining of results from the assignments and tests. Together, these issues resulted in many students receiving a higher grade than staff felt was appropriate. The high proportion of students with HD grades supports this perception.

Overall, it was felt that portfolio assessment offered great potential, but that this iteration had not managed to create a suitable environment in which these benefits could be realized.

 B.  Iteration 2

1) Focus: This iteration included the delivery of Object Oriented Programming (A), and aimed to address three of the main concerns from Iteration 1: having fewer ILOs around which everything would be based, assessing the unit as a whole with 100% portfolio assessment, and specific assessment criteria expressing what was expected for each grade.

2) Action Plan: In this iteration the following aspects of the model reported in [3] were included:

• Unit outcomes were revised to use five ILOs.

• Assessment criteria were developed for each grade using the different levels from the SOLO taxonomy [9]. This was presented in a format similar to that shown in Fig. 7, though some details differed.

• Feedback was provided using weekly formative assessments, and tests.

TABLE I. UNITS IN EACH ITERATION

TABLE II. UNIT RESULTS ACROSS ITERATIONS 1 TO 9

Fig. 3. Result distributions from iterations 1 and 2


The following aspects differed:

• A flipped classroom model [10,11] was adopted: students were provided with online videos covering the weekly lecture material, and classroom activities were predominantly interactive.

• Assessment criteria were adapted from [12] (p. 226) and indicated how each ILO could be met to a Marginal, Adequate, Good or Excellent standard.

• A C grade required three ILOs to be addressed at an Adequate standard, D required two at a Good standard (with all others Adequate), and HD required two Excellent and all others Good (see the sketch after this list).
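These rules can be read as a mapping from per-ILO standards to grades. The C++ sketch below encodes that reading for the five ILOs of this iteration; the Pass criteria and any edge cases are assumptions, as the paper does not state them:

    #include <algorithm>
    #include <string>
    #include <vector>

    enum class Standard { NotMet, Marginal, Adequate, Good, Excellent };

    // Number of ILOs demonstrated at or above a given standard.
    int atLeast(const std::vector<Standard> &ilos, Standard s)
    {
        return static_cast<int>(std::count_if(ilos.begin(), ilos.end(),
            [s](Standard x) { return x >= s; }));
    }

    // Iteration 2 grade rules as described above; Pass handling is assumed.
    std::string grade(const std::vector<Standard> &ilos)
    {
        int n = static_cast<int>(ilos.size());   // five ILOs in this iteration

        if (atLeast(ilos, Standard::Excellent) >= 2 && atLeast(ilos, Standard::Good) == n)
            return "HD";   // two Excellent, all others at least Good
        if (atLeast(ilos, Standard::Good) >= 2 && atLeast(ilos, Standard::Adequate) == n)
            return "D";    // two Good, all others at least Adequate
        if (atLeast(ilos, Standard::Adequate) >= 3)
            return "C";    // three ILOs at an Adequate standard
        return "P or below (criteria not detailed in this bullet)";
    }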

3) Data: Fig. 3 shows the result distributions for OOP (A) in Iteration 2. The HD results were closer to expectations, though the pass rate was a cause for concern.

4) Reflections and Analysis: Key staff reflections included:

• Interviewing all students meant that portfolio assessment was very time consuming.

• The general structure of the assessment criteria was suitable.

• There was a disconnect in perceived standard: the interpretation of "good" was significantly different between staff and students.

• Work was of a weaker standard than desired across all grades.

• Students did not benefit from the classroom flip, with few preparing adequately for the classroom discussions.

Staff felt that most of the issues from the semester could be attributed to non-productive "discovery learning" [13]. In the effort to implement constructive learning theories, staff had reduced the amount of guided learning activities, and student productivity appears to have been adversely affected.

It was still felt that portfolio assessment could be beneficial but that, again, this iteration had failed to realize any benefits. In many ways, this semester's results had felt like a backward step.

C.  Iterations 3 to 6

1) Focus: Iterations 3 to 6 used a similar approach to assessment as presented in iteration 2, and worked at identifying suitable criteria in each of the various categories for each of the ILOs.

2) Plan: Each iteration adopted the changes from prior iterations, and made iterative adaptations to wording to help better capture the intentions of each grade's criteria. Specific changes included:

1) Iteration 3:

• Adjusted the main category descriptors to: Adequate, Good, Outstanding, and Exemplary.

• The classroom flip was dropped, with videos now being used to support classroom activity.

• A reflective report was added that included the self-assessment: an alignment of the pieces to the ILOs, and general reflections.

2) Iteration 4:

• Reduced the number of items expected for each grade: C required one ILO to be met to a Good standard; D extended C with another ILO met at Outstanding; and HD extended D with a further ILO met at Exemplary.

3) Iteration 5:

•  P and C students were no longer interviewed.

• Tests became a hurdle requirement, and had to be completed to a satisfactory standard for students to be eligible to pass the unit.

4) Iteration 6:

• To meet the Outstanding standard for an ILO, students were required to develop a program of their own design and relate this to that ILO.

• Meeting the Exemplary standard required a research report related to the ILO.

3) Data: Fig. 4 shows the grade distributions for Introductory Programming (A). The pass rate improved over these iterations: from 69% in Iteration 1, to 72%, 77%, then 80% in Itera-

 

Fig. 4. Grade distributions for Introductory Programming (A) from iterations 1, 3, 5, 6 and 8.

Fig. 5. Grade distributions for Object Oriented Programming (A) from iterations 2, 4, 7 and 9.


V.  DISCUSSION

This paper presented results and analysis from nine iterations of an action research project that examined the implementation of portfolio assessment. The overall focus of this work was on adapting assessment criteria to ensure that grades had clear expectations and could be assessed in a timely manner, with higher grades representing increased depth of understanding.

Initial attempts at portfolio assessment failed to appropriately capture staff expectations, and student portfolios were generally weaker than desired. Over subsequent iterations, the assessment criteria evolved to more tightly define what was expected for each grade, and student portfolios improved to meet these expectations. Staff are now able to quickly assess portfolios, and feel that grades accurately reflect student outcomes.

This work provides additional evidence of the strength of portfolio assessment for achieving constructive alignment. In each iteration the assessment criteria helped guide students in preparing their portfolios, and as the criteria evolved the evidence in student portfolios improved. Experience delivering portfolio assessed units resulted in criteria that clearly related to active verbs at the multi-structural and relational levels of the SOLO taxonomy.

These experiences highlight the importance of ensuring ILOs are expressed clearly, and capture the core concepts and principles that need to be demonstrated in student portfolios. The developed assessment criteria provide statements of required levels of achievement, which together with the ILOs express what needs to be done, and how well it needs to be done, to achieve different grade outcomes.

VI. CONCLUSION

Action research is an effective means to iteratively improve assessment criteria. Staff found this particularly useful in developing and refining the criteria for portfolio assessment, with the developed criteria accurately reflecting student learning outcomes.

Portfolio assessment is strongly recommended as a means of achieving constructive alignment. The assessment criteria discussed in this paper enabled portfolio assessment to be performed accurately and efficiently, with later iterations involving hundreds of students. Portfolio assessment enabled staff and student efforts to be directed toward the one goal: helping students achieve the intended learning outcomes to the best of their ability.

REFERENCES

[1] J. Biggs, "Enhancing teaching through constructive alignment," Higher Education, vol. 32, pp. 347–364, 1996.
[2] J. Biggs and C. Tang, "Assessment by portfolio: Constructing learning and designing teaching," Research and Development in Higher Education, pp. 79–87, 1997.
[3] A. Cain and C. J. Woodward, "Toward constructive alignment with portfolio assessment for introductory programming," in Proceedings of the First IEEE International Conference on Teaching, Assessment and Learning for Engineering. IEEE, 2012, pp. 345–350.
[4] A. Cain and C. J. Woodward, "Examining student reflections from a constructively aligned introductory programming unit," in Proceedings of the 15th Australasian Computing Education Conference, vol. 136, 2013, pp. 129–136.
[5] A. Cain, C. J. Woodward, and S. Pace, "Examining student progress in portfolio assessed introductory programming," in Proceedings of the 2nd IEEE International Conference on Teaching, Assessment and Learning for Engineering. IEEE, 2013, in press.
[6] C. J. Woodward, A. Cain, S. Pace, A. Jones, and J. Funke Kupper, "Helping students track learning progress using burn down charts," in Proceedings of the 2nd IEEE International Conference on Teaching, Assessment and Learning for Engineering. IEEE, 2013, in press.
[7] J. W. Creswell, Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research. Upper Saddle River, NJ: Pearson/Merrill Prentice Hall, 2008.
[8] G. E. Mills, Action Research: A Guide for the Teacher Researcher, 4th ed. Pearson, 2010.
[9] J. B. Biggs and K. F. Collis, Evaluating the Quality of Learning: The SOLO Taxonomy (Structure of the Observed Learning Outcome). New York: Academic Press, 1982.
[10] J. W. Baker, "The classroom flip: Using web course management tools to become the guide on the side," in 11th International Conference on College Teaching and Learning, Jacksonville, FL, 2000.
[11] M. J. Lage and G. Platt, "The internet and the inverted classroom," The Journal of Economic Education, vol. 31, no. 1, p. 11, 2000.
[12] J. B. Biggs and C. Tang, Teaching for Quality Learning at University, 3rd ed. Open University Press, 2007.
[13] J. R. Anderson, L. M. Reder, H. A. Simon, K. A. Ericsson, and R. Glaser, "Radical constructivism and cognitive psychology," Brookings Papers on Education Policy, no. 1, pp. 227–278, 1998.

Fig. 7. Assessment criteria from Unit Outline for current iteration.
