Eng. Nisansa Dilushan de Silva, Dr. Shahani Markus Weerawarana, Dr. Amal Shehan Perera
Presented by– Nisansa Dilushan de Silva
Department of Computer Science and Engineering University of Moratuwa
Sri Lanka
Enabling Effective Synoptic Assessment via Algorithmic
Constitution of Review Panels
IEEE International Conference on Teaching, Assessment, and Learning for Engineering(TALE 2013)
• Introduction
• Challenges in creating assessment panels
• Proposed Solution
• Results
• Discussion
• Conclusion
• Future work and Recommendation
Outline
• 5th semester
• Creativity & software engineering rigor
• Precedes industrial training (6th semester)
• Comparable to a mini capstone project
• Course design straddles several program ILOs [3]
Introduction - Software Engineering Project (CS 3202) [2]
[2] Weerawarana, S. M., Perera, A. S., Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka, IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE2012), Hong Kong, August 2012.
[3] Anderson, L. & Krathwohl, D. A. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman.
• A large number of students (n=101)
• Assessment of the end-of-semester project demonstrations
  • Technical standpoint
  • Creative standpoint
• Needs a synoptic assessment methodology that “enables students to integrate their experiences, providing them with important opportunities to demonstrate their creativity” [4]
Software Engineering Project - Challenges
[4] Jackson, N. (2003). Nurturing creativity through an imaginative curriculum, Imaginative curriculum Project, Learning and Teaching Support Network, Higher Education Academy.
• Student projects cover a large spectrum of technologies; how can all of them be evaluated fairly?
• Is it enough to use university academic staff only? [2]
• Or should external evaluators from industry be involved, each an expert in some of the technologies used by students?
• We went with the second approach, agreeing with Elton that assessment of creative work should be ‘viewed in light of the work’, and highlighting as important “the ability of experts to assess work in their own field of expertise” and “the willingness to employ judgment” [5]
[5] Elton, L. (2005). Designing assessment for creativity: an imaginative curriculum guide. York: Higher Education Academy (in offline archive).
• Is one evaluator, matched by technological expertise, enough to evaluate a student project fairly?
• Evaluation of creative projects is subjective
• A single judgement might therefore not be fair
• We decided to use panels of evaluators, in accordance with Balchin's suggestion that “consensual assessment by several judges” can enhance the reliability of subjective evaluation [6]
[6] Balchin, T. (2006). Evaluating creativity through consensual assessment, in N. Jackson, M. Oliver, M. Shaw and J. Wisdom (eds) Developing Creativity in Higher Education: Imaginative Curriculum. Abingdon: Routledge.
• Mapping individual evaluators to student projects according to expertise in technology was a fairly easy task…
Creating Evaluation Panels - Challenges
• But with panels it is not as easy to assign the ‘best fit’ panel of evaluators, comprising external industry experts and internal faculty members, given the technologies used in each student's project
• Multiple and often conflicting evaluator availability constraints
• Balance of external versus internal evaluators in the panels
• Minimization of the number of panel composition reshuffles
• Avoidance of the same internal evaluator who assessed the mid-semester project demonstration being included in the end-semester evaluation panel
• Prevention of internal evaluators who mentored specific projects being included in the end-semester evaluation panel for the same projects
Introduce an algorithm to automate the composition and scheduling process for the synoptic assessment panels
Solution
• Internal evaluator conflict data (Matrix C)
  • Was a mentor
  • Was the mid-semester evaluator
More Inputs….
             I. Evaluator 1   I. Evaluator 2   ...   I. Evaluator 7
Student 1          1                0          ...         0
Student 2          0                0          ...         1
...               ...              ...         ...        ...
Student 101        1                1          ...         0
• Evaluator availability (Matrix D)
  • Some are available throughout the day
  • Some arrive late
  • Some plan to leave early
               Time slot 1   Time slot 2   ...   Time slot 24
I. Evaluator 1      1             1        ...        0
I. Evaluator 2      1             1        ...        1
...                ...           ...       ...       ...
I. Evaluator 7      1             1        ...        0
E. Evaluator 1      1             1        ...        1
E. Evaluator 2      0             0        ...        1
...                ...           ...       ...       ...
E. Evaluator 9      0             1        ...        0
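The availability data above is a plain 0/1 matrix and can be held directly as one; a minimal Python sketch (sizes and values are illustrative, not the actual 16 × 24 data):

```python
import numpy as np

# Toy availability matrix D: D[e, t] = 1 if evaluator e is free in time slot t.
# (Illustrative 3 evaluators x 4 slots only.)
D = np.array([[1, 1, 0, 0],   # plans to leave early
              [0, 1, 1, 1],   # arrives late
              [1, 1, 1, 1]])  # available throughout the day

# Evaluators who could sit on a panel in a given slot, e.g. slot 2:
free_in_slot = np.flatnonzero(D[:, 2])
```

A column slice of D gives, per time slot, the set of evaluators a panel may draw from.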
• Other constraints are in place to ensure that a fair evaluation takes place [2]
  • Grading calibration
  • Minimum number of evaluators in a panel
  • At least one internal evaluator in a panel
• Calculate the student technology request feasibility
Algorithm: Rough outline
E = A ∙ Bᵀ

             E. Evaluator 1   E. Evaluator 2   ...   E. Evaluator 9
Student 1          0                1          ...         0
Student 2          1                2          ...         1
...               ...              ...         ...        ...
Student 101        1                1          ...         1
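The matrix product above is directly computable; a minimal sketch, assuming (as the product E = A ∙ Bᵀ implies) that A is the student-by-technology request matrix and B the external-evaluator-by-technology expertise matrix (toy sizes, illustrative values):

```python
import numpy as np

# A[i, t] = 1 if student i's project uses technology t
# (toy 3 students x 4 technologies).
A = np.array([[1, 0, 1, 0],
              [0, 1, 1, 0],
              [0, 0, 0, 1]])
# B[j, t] = 1 if external evaluator j is an expert in technology t
# (toy 2 evaluators x 4 technologies).
B = np.array([[1, 0, 1, 1],
              [0, 1, 0, 0]])

# E[i, j] counts how many of student i's technologies evaluator j covers.
E = A @ B.T
```

Each entry of E is the technology-match score used later when valuing a panel's external evaluators against a project.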
• Create a set of panels with a randomization algorithm
• Assign panel slots to created panels
• Apply internal evaluator constraints
• Calculate the merit value for each panel slot against each project (Matrix F)
V_{Pro,Int} = ∏_{k=1}^{p} C_{i,k}

V_{Pro,ext} = ∑_{l=1}^{m} E_{i,l}

V_{Pro,g-pan} = V_{Pro,Int} × V_{Pro,ext}

V_{Pro,n-pan} = w₁ × V_{Pro,g-pan} + w₂
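The merit value of one panel against one project might be sketched as below. The encoding of Matrix C (here assumed 1 = no conflict, 0 = conflict, so that a single conflicted internal evaluator zeroes the internal factor) and the role of w₂ (taken, as shown on the slide, as a plain additive weight) are assumptions:

```python
import numpy as np

def panel_merit(C_row, E_row, internal_idx, external_idx, w1, w2):
    # V_int: product of conflict entries over the panel's p internal
    # evaluators (assumed 1 = no conflict, so any 0 voids the panel).
    v_int = np.prod(C_row[internal_idx])
    # V_ext: sum of technology-match scores over the m external evaluators.
    v_ext = np.sum(E_row[external_idx])
    # General panel value, then the weighted per-project value.
    v_gpan = v_int * v_ext
    return w1 * v_gpan + w2
```

For example, a panel whose internal evaluators are conflict-free and whose external evaluators score 2 and 1 against the project gives v_gpan = 3.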
             Panel-slot 1   Panel-slot 2   ...   Panel-slot 108
Student 1        20              4         ...         0
Student 2         9             32         ...        65
...              ...            ...        ...        ...
Student 101       0             19         ...        16
• Create the inverted value matrix (Matrix G)
• Run the well-known combinatorial optimization method, the Hungarian algorithm [7], on the above data (results in the Boolean Matrix H)
[7] Kuhn, H. W. (1955). The Hungarian method for the assignment problem, Naval Research Logistics Quarterly, 2:83–97
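On the inverted matrix G the task is a minimum-cost one-to-one assignment. A brute-force search over a toy 3 × 3 matrix (all values illustrative) shows the objective; for the real 101 × 108 problem a polynomial-time Hungarian implementation such as scipy.optimize.linear_sum_assignment would be used instead:

```python
from itertools import permutations

# Toy inverted-value matrix G: lower entries mean a better project/slot fit.
G = [[4, 1, 3],
     [2, 0, 5],
     [3, 2, 2]]
n = len(G)

# Exhaustive search for the minimum-cost assignment
# (the quantity the Hungarian algorithm computes in O(n^3) time).
best_cost, best_perm = min(
    (sum(G[i][p[i]] for i in range(n)), p) for p in permutations(range(n))
)

# Boolean result matrix H: H[i][s] = 1 iff project i gets panel-slot s.
H = [[int(best_perm[i] == s) for s in range(n)] for i in range(n)]
```

Here the optimum assigns project 0 to slot 1, project 1 to slot 0, and project 2 to slot 2, at total cost 5.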
• The underlying randomness of the base ‘seed’ evaluation panels fed into the core algorithm might introduce a certain degree of unfairness to the system
• To fix this, we introduced a second layer of indirection over the core algorithm
• The result was thus a pool of panel assignment schedules instead of a single schedule
• The best of these was selected as the winning schedule
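The second layer might be sketched as an outer loop over the randomised core algorithm; build_schedule and score below stand in for the core algorithm and the schedule-value function (both names are hypothetical):

```python
import random

def best_schedule(build_schedule, score, pool_size=50, seed=0):
    """Run the randomised core algorithm pool_size times and keep the
    highest-scoring panel assignment schedule."""
    rng = random.Random(seed)  # a fixed seed keeps the run repeatable
    pool = [build_schedule(rng) for _ in range(pool_size)]
    return max(pool, key=score)
```

Because only the winning schedule is kept, the influence of any one unlucky seed panel set diminishes as the pool grows.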
Algorithm: Removing the randomness
V = ∑_{i=1}^{101} ∑_{s=1}^{108} H_{i,s} × F_{i,s}
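This selection criterion is a straightforward element-wise sum over the assignment; a minimal sketch:

```python
def schedule_value(H, F):
    # V = sum over all projects i and panel-slots s of H[i][s] * F[i][s]:
    # only the merit values of the chosen assignments contribute.
    return sum(h * f
               for h_row, f_row in zip(H, F)
               for h, f in zip(h_row, f_row))
```

For instance, schedule_value([[0, 1], [1, 0]], [[20, 4], [9, 32]]) picks out F[0][1] and F[1][0], giving 13.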
[Figure] Algorithm pool results (left: pool size = 50, right: pool size = 2000). x-axis: epochs, y-axis: total value of assignment schema.
• The algorithm matched 120 of the 141 feasible requests, an 85.1% success rate
• Among the 83 students whose requests constituted the 141 feasible requests, the average share of requests satisfied per student was 92.1%
• Across the 18 technologies, the average share of requests satisfied per technology was 71.69%
Results
• The advantage of automating the panel composition process showed when some external evaluators made sudden changes to their time constraints, or cancelled their commitment mere hours prior to the commencement of the project demonstrations
• In this critical situation the algorithm facilitated rapid recalculation, producing an alternate optimal schedule
Discussion
This approach can be considered a major improvement over the manual assignment of panels for synoptic assessment.
Conclusion
Future work is to implement an online application.
We recommend that other educators use such an application for a similar purpose.
Future work & Recommendation
[2] Weerawarana, S. M., Perera, A. S., Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka, IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE2012), Hong Kong, August 2012.
[3] Anderson, L. & Krathwohl, D. A. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman.
[4] Jackson, N. (2003). Nurturing creativity through an imaginative curriculum, Imaginative curriculum Project, Learning and Teaching Support Network, Higher Education Academy.
[5] Elton, L. (2005). Designing assessment for creativity: an imaginative curriculum guide. York: Higher Education Academy (in offline archive).
[6] Balchin, T. (2006). Evaluating creativity through consensual assessment, in N. Jackson, M. Oliver, M. Shaw and J. Wisdom (eds) Developing Creativity in Higher Education: Imaginative Curriculum. Abingdon: Routledge.
[7] Kuhn, H. W. (1955). The Hungarian method for the assignment problem, Naval Research Logistics Quarterly, 2:83–97.
References