
Macro Data for Micro Learning: Developing FUN! For Automated Assessment of Computational Thinking in Scratch

Taylor Martin & Deborah Fields

Introduction

Methods

Methods Cont.

References

In this exploratory research project, funded by an NSF Cyberlearning grant, we seek to advance the understanding of personalized learning in constructionist, technology-enhanced environments by capitalizing on the capability of automated assessments to make that learning visible.

Specifically, our research focuses on the programming environment Scratch. We introduce children to Scratch in constructionist workshops in which the children themselves set goals that require complex thinking and learning. The challenge lies in making this learning visible to students, teachers, parents, and educational stakeholders.

Currently we are in the process of developing the second half of the Fun! Tool, which will automatically analyze students' projects based on several programming concepts identified as important for novice programmers.

For example, Figure 6 shows a visualization of a student's Scratch project created with Tableau. The visualization shows how the scripts in the project are connected, either through the 'Green Flag' (green dot and lines) or through 'Broadcasts' (orange lines).
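As an illustration of the kind of script-connectivity analysis that a visualization like Figure 6 rests on, the following Python sketch counts 'Green Flag' scripts and collects broadcast messages from a Scratch 2.0-style project.json. The opcode names ("whenGreenFlag", "broadcast:", "whenIReceive") follow the Scratch 2.0 JSON format; the function names and the sample project below are hypothetical illustrations, not part of the actual Fun! Tool.

```python
# Sketch: extract Green Flag scripts and broadcast links from a
# Scratch 2.0-style project.json (loaded as a Python dict). The stage
# holds sprites under "children"; each target's "scripts" entry is
# [x, y, [block, block, ...]], where a block is a list whose first
# element is an opcode string.

def walk_blocks(blocks):
    """Yield every block in a script, including blocks nested in loop bodies."""
    for block in blocks:
        if isinstance(block, list) and block and isinstance(block[0], str):
            yield block
            for arg in block[1:]:          # substacks (e.g. repeat bodies) are lists
                if isinstance(arg, list):
                    yield from walk_blocks(arg)
        elif isinstance(block, list):      # a bare substack of blocks
            yield from walk_blocks(block)

def script_links(project):
    """Return (number of Green Flag scripts, set of broadcast messages used)."""
    green_flag_scripts = 0
    broadcasts = set()
    targets = [project] + project.get("children", [])
    for target in targets:
        for _x, _y, blocks in target.get("scripts", []):
            for block in walk_blocks(blocks):
                if block[0] == "whenGreenFlag":
                    green_flag_scripts += 1
                elif block[0] in ("broadcast:", "doBroadcastAndWait", "whenIReceive"):
                    broadcasts.add(block[1])
    return green_flag_scripts, broadcasts

# Hypothetical miniature project: one Green Flag script that broadcasts
# "start", and a second script triggered by receiving it.
sample = {
    "objName": "Stage",
    "scripts": [],
    "children": [
        {"objName": "Sprite1",
         "scripts": [
             [10, 10, [["whenGreenFlag"], ["broadcast:", "start"]]],
             [10, 120, [["whenIReceive", "start"], ["say:", "hi"]]],
         ]},
    ],
}

gf_count, messages = script_links(sample)   # → (1, {"start"})
```

Each message in the returned set corresponds to one orange link in a Figure 6-style graph; the Green Flag count gives the green-dot connections.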

Based on the results of our first set of workshops, we have made revisions to the Fun! Tool, and we are currently conducting our second set of workshops. These workshops will finish by the end of July 2014, after which an even more in-depth analysis will begin. Results will follow, and final adjustments to the Fun! Tool will be made.

In addition, we plan to create a graphical user interface for the Fun! Tool that will include a dashboard for other researchers and teachers to use (see Figure 8).

Denner, J., Werner, L., & Ortiz, E. (2011). Computer games created by middle school girls: Can they be used to measure understanding of computer science concepts? Computers & Education, 58, 240–249.

Grover, S., & Pea, R. (2013). Computational Thinking in K–12: A Review of the State of the Field. Educational Researcher, 42(1), 38–43. doi:10.3102/0013189X12463051

Maloney, J. H., Peppler, K., Kafai, Y., Resnick, M., & Rusk, N. (2008). Programming by choice: urban youth learning programming with scratch. ACM SIGCSE Bulletin, 40(1), 367–371.

Resnick, M., Maloney, J., Monroy-Hernández, A., Rusk, N., Eastmond, E., Brennan, K., … others. (2009). Scratch: programming for all. Communications of the ACM, 52(11), 60–67.

Analysis

Next Steps

Scratch Environment

Scratch was created by the MIT Media Lab as an introductory programming environment for children. It was designed so that children can quickly and easily make creative projects without prior programming knowledge, while still allowing the use of complex programming concepts such as variables and conditionals (Grover & Pea, 2013). Children choose a character called a sprite and then program it using commands called blocks that snap together like Legos (Resnick et al., 2009). See Figure 1.

Figure 1: Sample Student Scratch Project

Figure 2: Study Design

Figure 3: Workshop Number 1 Participants

Figure 4: Workshop Instruction Design

Figure 5: Design Challenge Descriptions

Figure 8: Sample Dashboard

Figure 6: Visualization of a Student's Scratch Project

The first workshop was conducted in fall 2013 (see Figure 3) and resulted in the collection of approximately 11,700 saves of student projects, which we are currently analyzing.

During the first workshop, participants engaged in a total of 10 hours of introductory programming. Workshops followed a studio-arts model of instruction, which offered opportunities for personal expression and exploration while providing structure through design challenges (see Figures 4 & 5).

To answer our research questions we are developing a tool called the Fun! Tool, which has two purposes. First, it will collect snapshots of children's Scratch code at regular intervals so that we can see their progress over time at a fine-grained level. Second, it will analyze the code and report on how each student is progressing.
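The snapshot-collection half described above could be sketched roughly as follows. The paths, file naming, and simple copy-on-interval approach are assumptions for illustration, not the Fun! Tool's actual implementation.

```python
import shutil
import time
from pathlib import Path

# Sketch: copy a student's Scratch project file into a snapshot directory
# with a timestamped name, so later analysis can replay progress over time.
# Note: snapshots taken within the same second would share a name and
# overwrite each other; a real tool would use a finer-grained identifier.

def snapshot_once(project_file, snapshot_dir):
    """Copy project_file into snapshot_dir under a timestamped name."""
    project_file = Path(project_file)
    snapshot_dir = Path(snapshot_dir)
    snapshot_dir.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = snapshot_dir / f"{project_file.stem}-{stamp}{project_file.suffix}"
    shutil.copy2(project_file, dest)   # copy2 preserves file timestamps
    return dest

def collect_snapshots(project_file, snapshot_dir, interval=60, count=None):
    """Yield snapshot paths every `interval` seconds (forever if count is None)."""
    taken = 0
    while count is None or taken < count:
        yield snapshot_once(project_file, snapshot_dir)
        taken += 1
        if count is None or taken < count:
            time.sleep(interval)
```

Each saved copy can then be fed to the analysis half of the tool, turning a single final project into a timeline of intermediate states.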

To provide a testing ground for the Fun! Tool, we are conducting a series of Scratch workshops for novice programmers in grades 5–8 (see Figure 2).

Figure 7: Sample Analysis Graph

Research Goals

Another example of the types of analyses we are conducting is shown in Figure 7. This visualization was also created with Tableau and shows how participants' use of 'Broadcasts' changed over the course of the workshop.
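The per-day aggregation behind a chart like Figure 7 might look like this sketch, where each record is one saved project snapshot tagged with its class, instruction day, and broadcast-block count. The field names ("class", "day", "broadcasts") are hypothetical names chosen for illustration.

```python
from collections import defaultdict
from statistics import mean

# Sketch: average the number of broadcast blocks per class per
# instruction day, the shape of data a Figure 7-style line chart plots.

def average_broadcasts(records):
    """Return {(class, day): mean broadcast count} over all snapshots."""
    buckets = defaultdict(list)
    for rec in records:
        buckets[(rec["class"], rec["day"])].append(rec["broadcasts"])
    return {key: mean(counts) for key, counts in buckets.items()}

# Hypothetical snapshot records for two classes.
snapshots = [
    {"class": "A", "day": 1, "broadcasts": 0},
    {"class": "A", "day": 1, "broadcasts": 2},
    {"class": "A", "day": 2, "broadcasts": 4},
    {"class": "B", "day": 1, "broadcasts": 1},
]

averages = average_broadcasts(snapshots)
# averages[("A", 1)] is the mean of 0 and 2, i.e. 1
```

Plotting the resulting (day, average) pairs per class gives one line per class, matching the chart's layout.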

[Figure 7 chart: "Average Number of Broadcasts Per Instruction Day"; series: Class A, Class B; x-axis: Instruction Day; y-axis: Average Number of Broadcasts]