
The Pennsylvania State University

The Graduate School

College of Engineering

THE DEVELOPMENT OF A GAMIFICATION SYSTEM FOR ENGINEERING LAB

ACTIVITIES BASED ON USER CENTERED DESIGN

A Dissertation in

Industrial Engineering

by

Eunsik Kim

© 2018 Eunsik Kim

Submitted in Partial Fulfillment

of the Requirements

for the Degree of

Doctor of Philosophy

August 2018

The dissertation of Eunsik Kim was reviewed and approved* by the following:

Andris Freivalds

Lucas Professor of Industrial and Manufacturing Engineering

Dissertation Co-Advisor

Co-Chair of Committee

Ling Rothrock

Professor of Industrial and Manufacturing Engineering

Dissertation Co-Advisor

Co-Chair of Committee

Thomas Litzinger

Assistant Dean for Educational Innovation and Accreditation and Director of Leonhard

Center

Conrad Tucker

Associate Professor of Industrial and Manufacturing Engineering

Janis Terpenny

Peter & Angela Dal Pezzo Department Head of Industrial and Manufacturing Engineering

*Signatures are on file in the Graduate School


ABSTRACT

Gamification can be defined as the use of game elements and mechanics as well as game

design techniques in non-game contexts. It is no surprise that in recent years gamification has been used to encourage people to engage in desired behaviors in business,

marketing, corporate management, and online communities and social networks. Lee and

Hammer have theorized that gamification can also be applied to the education field as a tool to

increase student engagement and motivate students to learn. While numerous studies on

gamification have been conducted to explore its impact on students’ learning, there is little

empirical evidence to support the effectiveness of gamification at motivating and engaging

students. In particular, little research has been conducted on the application of gamification to

engineering lab activities.

The overall goal of engineering education is to prepare students to practice engineering

and, in particular, to deal with the forces and materials of nature. As such, lab activity is essential

to an education in engineering. Beyond gaining theoretical knowledge in the classroom, vital

practical knowledge and experience can only be obtained in the lab. Lab activity also improves

teamwork among students, as they must work in groups while dealing with real data and case

studies.

The aim of this study is to develop an effective gamification system for engineering education by applying a User-Centered Design process, and to investigate the impact of students' personality traits on gamification engagement as well as the relationship between gamification engagement and each type of student motivation within a self-determination theory (SDT) framework. The specific aims of

this study include: (1) determining the effects of gamification on engineering lab activities in

terms of motivation, engagement, and performance, (2) developing an effective gamification

system for engineering lab activities based on a User-Centered Design (UCD) process, (3)


determining the role of students’ personality traits in the effects of gamification and (4)

determining the relationship among gamification, each type of motivation (intrinsic, extrinsic, and

amotivation), and learning outcomes.

For the first aim, two types of websites were created to collect data from students who

were enrolled in an undergraduate Introduction to Human Factors course taught at The

Pennsylvania State University in the fall semester of 2015. The two types of websites were

Gamification (GM) and Non-Gamification (NG). While the GM website included game elements

such as a Badge System, Score, Avatar, Leaderboard, Level, and Feedback (Notification), the NG

website was a traditional website without game elements. In each of these websites, students

could create their own multiple-choice questions (MCQs) and answer questions authored by their

classmates. The results suggest that the application of gamification as a supporting tool in

engineering lab activities has a positive effect on students’ motivation, engagement, and learning

outcome based on the consistency between students’ performance in and subjective satisfaction

with the gamification system. In addition, the results of frequency analysis indicate that 80% of

students were motivated by ‘Ranking’ and ‘Score,’ and 50% of students experienced fun due to

‘Badges,’ ‘Feedback,’ and ‘Avatar.’ Students chose ‘Ranking’ and ‘Score’ as the game elements

to be retained in the new gamification system.

The second and third aims were focused on the research question of how to develop an

effective gamification system to improve the effectiveness of gamification on students’ learning.

To answer this research question, I conducted an experiment with a total of 105 students by using

two types of gamification systems: Initial Version and User-Centered Designed Version. The

usability test identified a total of 25 unique usability problems across 5 categories: (1) Design, (2)

Navigation, (3) Game Element, (4) Main Activity, and (5) Feedback. Applying the User-Centered

Design process had a positive effect on building an effective gamification system, increasing the

level of engagement with the gamification website activity. In addition, a number of relationships were


identified between different personality traits in students and (1) perceptions of gamification, (2)

engagement with gamification, and (3) learning outcomes. Our findings suggest that the effects of

gamification vary depending on individual attributes. In addition, I suggest that gamification

developers apply UCD in the development process.

For the last aim, I used structural equation modeling to investigate in depth the relationship

between gamification and student motivation within the framework of self-determination theory.

The results showed that gamification activity had a significant positive influence on intrinsic and

extrinsic motivation and had a significant negative influence on amotivation. I also found that

motivation had a significant positive influence on learning outcome.

The present study is one of the first to cover several aspects still underexplored in current

gamification research. I attempted to empirically evaluate the impact on student motivation of

applying a UCD process to gamification within an SDT framework, which is seldom empirically

studied in gamification literature. Furthermore, this study contributes to current research as the

first empirically validated study to measure results across three repetitions of the experiment. In

addition, this study is also one of the first to empirically find that even reward-based gamification

can increase students’ intrinsic motivation, suggesting that it is possible to change students’

behavior. However, since I did not determine the effects of individual game elements on students'

motivation, more empirical research is necessary to determine why particular game elements play

a role as extrinsic or intrinsic motivators in a given context, and how this, in turn, influences

students’ behavior. I expect that these results will inform instructors who are interested in

gamifying their courses and will help them in deciding how to develop gamification to use in

their specific context.


TABLE OF CONTENTS

List of Figures .......... ix

List of Tables .......... xi

Chapter 1 Introduction .......... 1
    1.1 Problem Statement .......... 1
    1.2 Study objective .......... 5
    1.3 Outline of the thesis .......... 6

Chapter 2 Literature Review .......... 7
    2.1 Gamification .......... 7
        2.1.1 Gamification-Related Concepts .......... 9
            2.1.1.1 Serious Games .......... 10
            2.1.1.2 Persuasive Games .......... 11
        2.1.2 Game elements .......... 12
            2.1.2.1 Game Components .......... 15
            2.1.2.2 Game Mechanics .......... 15
            2.1.2.3 Game Dynamics .......... 16
        2.1.3 Gamification in Education .......... 17
    2.2 User Centered Design .......... 21
        2.2.1 UCD Principles .......... 22
        2.2.2 Usability .......... 26
    2.3 Motivation .......... 28
        2.3.1 Cognitive Evaluation Theory .......... 29
        2.3.2 Self-Determination Theory .......... 31
        2.3.3 Self-Determination Theory for Gamification .......... 34
    2.4 Personality .......... 36

Chapter 3 The Effects of Gamification on Engineering Lab Activities .......... 39
    3.1 Introduction .......... 39
    3.2 Methods .......... 39
        3.2.1 Websites .......... 40
        3.2.2 Design of experiment .......... 45
    3.3 Results .......... 46
    3.4 Discussion .......... 51

Chapter 4 An Empirical Study on the Impact of Lab Gamification on Engineering Students' Satisfaction and Learning .......... 54
    4.1 Introduction .......... 54
    4.2 Method .......... 55
        4.2.1 Website .......... 55
        4.2.2 Participants and procedure .......... 59
        4.2.3 Measurement and Data analysis .......... 61
    4.3 Results .......... 62
        4.3.1 Demographic statistics .......... 62
        4.3.2 Factor analysis .......... 63
        4.3.3 Descriptive Analysis .......... 64
        4.3.4 Hypothesis testing .......... 66
        4.3.5 Additional findings .......... 69
        4.3.6 Frequency analysis .......... 69
    4.4 Discussion .......... 71

Chapter 5 Investigating the Impact of Personality Traits and a User-Centered Design Process in a Gamified Laboratory Context .......... 75
    5.1 Introduction .......... 75
    5.2 Pilot Experiment .......... 75
        5.2.1 Method .......... 75
            5.2.1.1 Participants and procedure .......... 76
        5.2.2 Results .......... 77
    5.3 Experiment .......... 80
        5.3.1 Method .......... 80
            5.3.1.1 Participants and procedure .......... 81
            5.3.1.2 Measurement .......... 83
            5.3.1.3 Data analysis .......... 85
        5.3.2 Results .......... 85
        5.3.3 Discussion .......... 89

Chapter 6 Explore the relationship between gamification and motivation through the lens of self-determination theory (SDT) .......... 95
    6.1 Introduction .......... 95
    6.2 Method .......... 96
        6.2.1 Participants and procedure .......... 96
        6.2.2 Measurement .......... 97
            6.2.2.1 Gamification .......... 97
            6.2.2.2 Questionnaire for students' motivation .......... 97
            6.2.2.3 Learning outcome .......... 98
        6.2.3 Data analysis .......... 99
    6.3 Results .......... 99
        6.3.1 Structured model evaluation .......... 101
    6.4 Discussion .......... 103

Chapter 7 Conclusion .......... 106

Appendix A The first questionnaire for gamification group .......... 110

Appendix B The second questionnaire for gamification group .......... 112

Appendix C The questionnaire for non-gamification group .......... 113

References .......... 114


LIST OF FIGURES

Figure 2-1 Gartner Hype Cycle for 2011 .......... 7
Figure 2-2 Example of Foursquare .......... 9
Figure 2-3 Gamification and Related Concepts .......... 10
Figure 2-4 The Game Element Hierarchy (adapted from Werbach & Hunter, 2012) .......... 13
Figure 2-5 Work distribution by year of publication (*: 1st Quarter of Year) .......... 18
Figure 2-6 Work distribution by subject (adapted from Dicheva et al., 2015) .......... 19
Figure 2-7 The gulfs and bridges of execution and evaluation .......... 24
Figure 2-8 The spectrum of motivation according to SDT (adapted from Ryan & Deci, 2000) .......... 32
Figure 3-1 The main menus of the two websites: (A) Gamification (B) Non-Gamification .......... 42
Figure 3-2 Samples of the (A) "View my badges" and (B) "Leaderboard" pages .......... 43
Figure 3-3 Timeline of the experiment .......... 46
Figure 3-4 The difference of exam scores between the two groups for both phases .......... 49
Figure 3-5 The difference of the number of questions between two groups (GM to NG, NG to GM) .......... 50
Figure 3-6 The difference of the number of answers between two groups (GM to NG, NG to GM) .......... 50
Figure 3-7 The difference of the number of distinct days between two groups (GM to NG, NG to GM) .......... 51
Figure 4-1 The main page of the two websites: (A) Gamification (B) Non-gamification .......... 57
Figure 4-2 Example of evaluation page in website .......... 58
Figure 4-3 Timeline of the experiment .......... 61
Figure 4-4 Results of frequency analysis for short answer question .......... 70
Figure 5-1 Example pages from the pre-modification gamification system: (a) main page, (b) question list page, and (c) a question response page .......... 79
Figure 5-2 Example of pages from the post-modification gamification system: (a) main page 1, (b) main page 2, (c) question list page, (d) check answers page, (e) a question response page, and (f) a question evaluation page .......... 80
Figure 5-3 Results of students' preferences between IV and UCD systems .......... 87
Figure 6-1 Structural equation model depicting relationship between gamification, motivation, and performance .......... 103


LIST OF TABLES

Table 2-1 Categories of game elements based on Zichermann and Cunningham (2011) .......... 13
Table 2-2 Levels of game design elements .......... 14
Table 2-3 Game Mechanics Descriptions .......... 16
Table 2-4 Key principles for UCD (adapted from Gulliksen et al., 2003) .......... 25
Table 3-1 Examples of the badges with descriptions .......... 44
Table 3-2 The number of students who joined each type of website for each phase .......... 46
Table 3-3 The summary of website activities between two groups for both phases .......... 48
Table 4-1 Summary of extra credit for participating in this study .......... 60
Table 4-2 Results of factor analysis from questionnaire .......... 64
Table 4-3 Descriptive statistics for questionnaire response .......... 65
Table 4-4 Descriptive statistics for gamification engagement .......... 65
Table 4-5 Descriptive statistics for learning outcomes .......... 65
Table 4-6 ANOVA analysis of active learning for gamification engagement and learning outcomes .......... 66
Table 4-7 ANOVA analysis of Motivation for gamification engagement and learning outcomes .......... 67
Table 4-8 ANOVA analysis of Game Element for gamification engagement and learning outcomes .......... 68
Table 4-9 Correlation results among gamification performance, the questionnaire results, and learning outcomes (n=86) .......... 68
Table 4-10 Results of two-tailed paired t-test .......... 69
Table 4-11 The results of frequency analysis for open-ended question .......... 71
Table 5-1 Summary of usability problems .......... 78
Table 5-2 Summary of extra credit for participating in this study .......... 82
Table 5-3 Score algorithm in gamification systems .......... 84
Table 5-4 The summary of gamification activities between two groups for both phases .......... 86
Table 5-5 Correlation results among gamification engagement, learning outcomes, and students' personality traits (n=62) .......... 88
Table 5-6 Correlation results between the questionnaire results and students' personality traits .......... 89
Table 5-7 Stepwise regression analysis of gamification engagement against the Big Five factors .......... 93
Table 6-1 Summary of extra credit for participating in this study .......... 96
Table 6-2 Score algorithm in gamification systems .......... 97
Table 6-3 Students' exam scores of two semesters .......... 100
Table 6-4 The summary of gamification activities between two groups for both phases .......... 100
Table 6-5 Mean and standard deviation for AM, CM and amotivation .......... 101
Table 6-6 Reliability testing .......... 101
Table 6-7 The correlations between the different variables .......... 102


Chapter 1

Introduction

1.1 Problem Statement

The exponential growth of digital technologies not only enables great progress in the

information technology industry, but also affects innovation in an educational learning

environment. Nowadays, most students in college and university have grown up with technology

and the internet. Today’s students, known as “Generation Z,” are considered “dependent upon

technology” (Grail Research, 2011). It is not surprising that, instead of a notebook and pencil,

students use portable electronic devices such as laptops, smartphones, or tablets in class or in

their daily life for social, entertainment, and educational activities. According to the most recent

study by the Educause Center for Analysis and Research (ECAR) on the use of information

technology by undergraduates from 12 countries, 96% of students have and use their own laptop

in class and reported that laptops were very to extremely important for their academic success

(Brooks, 2016).

However, the use of laptops in classrooms is still controversial in educational practice.

On the one hand, a number of studies have identified that in-class laptop use has a positive

impact on student learning in terms of note-taking (Awwad & Ayesh, 2013; Gaudreau, Miranda,

& Gareau, 2014; R. H. Kay & Lauricella, 2011; Lindroth & Bergquist, 2010; Skolnik & Puzo,

2008), academic activities (Barak, Lipson, & Lerman, 2006; Skolnik & Puzo, 2008), access to

resources (Skolnik & Puzo, 2008), and communication (R. H. Kay & Lauricella, 2011;

Kraushaar & Novak, 2010; Lindroth & Bergquist, 2010). Conversely, several research studies

have shown that the usage of laptops facilitates distracting activities including surfing the web,

sending instant messages and emails, playing games, watching movies, and social networking


(Fried, 2008; Hembrooke & Gay, 2003; R. H. Kay & Lauricella, 2011; R. Kay & Lauricella,

2016). Despite the differences in their findings, the majority of the authors of these previous

studies shared the opinion that laptops need to be integrated into students’ educational

environment in order to prompt the academic use of laptops. Furthermore, higher education

institutions in North America have been encouraging instructors to incorporate various

technologies into their teaching in order to facilitate active learning (Altbach & Knight, 2007;

Hooker, 1997) and more student-centered learning (Gandell, Weston, Finkelstein, & Winer,

2000). Allen and Tanner (2005) define active learning as “seeking new information, organizing

it in a way that is meaningful, and having the chance to explain it to others,” and this

pedagogical approach has received considerable attention from engineering educators over the

past several decades (Deslauriers, Schelew, & Wieman, 2011; Johnson, Johnson, & Smith, 1998;

National Academy of Engineers, 2005). Learning-enhancing pedagogy is different from

traditional pedagogy in that it focuses more on developing students’ skills than on transmitting

information. In addition, an active learning strategy requires students to do something other than

simply listen passively by providing more and faster feedback between students and instructors.

Active learning methods have consistently been shown to increase student performance by

motivating them to attend and participate in science, technology, engineering, and mathematics

classes (Freeman et al., 2014; Yuretich, Khan, Leckie, & Clement, 2001).

“Learning by teaching,” in which students perform teaching-related activities such as

explaining and questioning, is a successful teaching strategy that better engages students in the

active learning process (Chang, Wu, Weng, & Sung, 2012; Umetsu, Hirashima, & Takeuchi,

2002; Fu‐Yun Yu, Liu, & Chan, 2005). Creating questions requires students to use higher-order

cognitive skills that help them achieve a higher level of knowledge. This activity is required not

only to help them comprehend content, but also to help them construct, organize, connect, and

interact with content and key concepts. However, Yu et al. (2005) point out that it is difficult for students to generate questions by themselves. They suggest providing supporting mechanisms

to sustain students’ motivation. It is in this context that the emerging approach of gamification,

which aims to improve the overall motivation and engagement of students, enables students to

“learn by teaching” (Hamari, Koivisto, & Sarsa, 2014; Reiners, Hebbel-Seeger, Reiners, &

Schäffer, 2014).

Gamification can be a promising tool in the effort to address the needs of Generation Z,

which prefers multiple streams of information and collaborative experiences. Previous research

has found Generation Z to be smarter and more self-directed than other generations but less able

to pay constant attention (Ding, Guan, & Yu, 2017; Igel & Urquhart, 2012). Gamification

therefore has received widespread attention in education as a new, adaptive learning method.

Because most gamification systems have been developed in Web 2.0 environments, this

approach is well suited to promote academic use of laptops among students in Generation Z.

In the field of education, gamification is defined as a pedagogy that motivates students and

increases engagement by using game elements to facilitate learning and fun. Since 2012, several

studies have been conducted to investigate the effect of gamification on students’ motivation,

engagement, and learning outcomes through both empirical and review methods (Barata, Gama,

Jorge, & Gonçalves, 2013; de Freitas & de Freitas, 2013; de Sousa Borges, Durelli, Macedo

Reis, & Isotani, 2014; Hamari et al., 2014; Iosup & Epema, 2014; Lee & Hammer, 2011;

Lounis, Pramatari, & Theotokis, 2014; Malamed, 2012; Wood & Reiners, 2012). Although

substantial research has followed this growing academic interest in gamification, and although

this research attends to the effectiveness of the gamification approach in comparison with more

traditional pedagogies, results among gamification studies have been inconsistent, leading to

widespread agreement in the literature on the need for further empirical research (de Sousa

Borges et al., 2014; Dicheva, Dichev, Agre, & Angelova, 2015; Hamari et al., 2014).


Most traditional gamification studies have ignored important variables like the

personality of students, potentially leading to these mixed results. It is well known that

personality contributes to the learning style and motivation of students, which are both essential

for achieving student performance and learning outcomes (Busato, Prins, Elshout, & Hamaker,

1998; Clark & Schroth, 2010; Frunham, 1996; Komarraju, Karau, & Schmeck, 2009; Ö nder,

Beşoluk, İskender, Masal, & Demirhan, 2014; Zhang, 2003). However, it is not yet known how

personality traits impact the efficacy of gamification. Research into personality’s impact on

gamification may inform its effective utilization and lead to further development in this area of

research. Understanding the needs of users/students should be the first step in answering the

research question of how to develop effective gamification systems. The User-Centered Design

(UCD) process provides the necessary framework for ensuring interaction between

users/students and designers by putting the users/students and their goals at the center of the

gamification design and development process. Even though Nicholson (2015) has suggested that

applying the UCD process to gamification is the key to developing successful gamification

systems, there are no empirical studies in the literature that confirm this.

Furthermore, the key features of most gamification systems are points, levels, and

leaderboards (Hamari et al., 2014; Seaborn & Fels, 2015). However, previous research in

psychology provides ample evidence that certain forms of rewards, feedback, and other external

events can have detrimental effects on intrinsic motivation and hence deter students from the

desired behavior. Such results suggest the need for more research on the effectiveness of

gamification aspects when it comes to the augmentation of long-term student motivation.

Because intrinsic motivation is essential to continuously successful learning behavior, it is

necessary to investigate the effect of gamification on students’ intrinsic motivation. However,

currently very few studies have attempted to empirically investigate the impact of gamification

on each type of motivation (intrinsic, extrinsic, and amotivation) within an SDT framework.


1.2 Study objective

In this study, I focused on developing a gamification framework based on User-Centered Design and on understanding gamification's impact on students' motivation, engagement, and learning outcomes. Furthermore, I investigated how students' personality traits played a role in gamification engagement, as well as how gamification affects each type of student motivation (intrinsic, extrinsic, and amotivation), using a structural equation model. The overall goal of this study was to improve the learning environment, enabling students to improve performance by increasing their motivation and engagement. In this study, I developed a gamification system in which students could create their own questions and answer the questions created by other students. I used an undergraduate introductory human factors course (IE327), a first-level junior course required for all baccalaureate students in the Department of Industrial and Manufacturing Engineering at The Pennsylvania State University. I collected data in various phases of the study across three semesters, from Fall 2015 to Fall 2017. The four detailed objectives of the study were as follows:

• Objective 1 - Investigate the effects of lab activity gamification on students’ motivation,

engagement, and learning outcome based on students’ performance and students’

perspective.

• Objective 2 - Investigate the effect of the application of the User-Centered Design process

on the development of an effective gamification system.

• Objective 3 - Explore the role of students’ personality traits in the effects of gamification

in terms of motivation, engagement, and learning outcome.

• Objective 4 - Determine the relationship among gamification, each type of motivation

(intrinsic, extrinsic and amotivation), and learning outcomes.
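To make the core lab activity concrete, the sketch below illustrates, in Python, the question-authoring workflow described above: students author questions, answer classmates' questions, and accumulate points that feed game elements such as the leaderboard. This is a minimal illustration only; the class names, point values, and functions are hypothetical and do not reproduce the actual system.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Question:
        """A student-authored multiple-choice question (illustrative model)."""
        author: str
        text: str
        choices: List[str]
        correct_index: int

    @dataclass
    class Student:
        name: str
        score: int = 0
        badges: List[str] = field(default_factory=list)

    # Hypothetical point values; the actual system's scoring rules differ.
    POINTS_FOR_AUTHORING = 10
    POINTS_FOR_CORRECT_ANSWER = 5

    def author_question(student: Student, question: Question, bank: List[Question]) -> None:
        """Add a new question to the shared bank and credit the author."""
        bank.append(question)
        student.score += POINTS_FOR_AUTHORING

    def answer_question(student: Student, question: Question, choice: int) -> bool:
        """Record an answer to a classmate's question; correct answers earn points."""
        is_correct = (choice == question.correct_index)
        if is_correct:
            student.score += POINTS_FOR_CORRECT_ANSWER
        return is_correct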


1.3 Outline of the thesis

The overall goal of the studies described in this thesis was to investigate the application

of User-Centered Design to the development of gamification designed to increase the effect of

gamification on students performing engineering lab activities. In Chapter 2, I review gamification, User-Centered Design, motivation, and personality. In Chapter 3, I focus

on the effects of the gamification of lab activities in engineering coursework by comparing

student outcomes from a gamification system that I developed with those from a non-

gamification system in terms of motivation, engagement, and performance. I further investigate

the effects of lab activity gamification based on students’ performance and students’ perspective

(Chapter 4). In Chapter 5, I explore how to build an effective gamification system by applying

the User-Centered Design process and consider the role of students’ personality traits in the

effects of gamification on motivation, engagement, and learning outcomes. Finally, I use

structural equation modeling to investigate in depth the relationship between gamification and

student motivation within the framework of self-determination theory (Chapter 6).
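As a preview of the analysis in Chapter 6, the following sketch shows how such a structural model could be specified and fitted in Python. It assumes the third-party semopy package and a pandas DataFrame containing one column per variable; for brevity it treats each construct as an observed composite score rather than a latent variable with questionnaire indicators, and all variable names are illustrative rather than the actual measurement items.

    import pandas as pd
    import semopy  # third-party SEM package (assumed to be installed)

    # Simplified model in lavaan-style syntax: gamification activity predicts each
    # type of motivation, which in turn predicts learning outcome.
    MODEL_DESC = """
    intrinsic ~ gamification
    extrinsic ~ gamification
    amotivation ~ gamification
    learning_outcome ~ intrinsic + extrinsic + amotivation
    """

    def fit_sem(data: pd.DataFrame) -> pd.DataFrame:
        """Fit the structural model and return parameter estimates."""
        model = semopy.Model(MODEL_DESC)
        model.fit(data)          # maximum-likelihood estimation by default
        return model.inspect()   # path coefficients, standard errors, p-values

    # Hypothetical usage: a CSV file with one column per observed variable.
    # estimates = fit_sem(pd.read_csv("gamification_sdt.csv"))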


Chapter 2

Literature Review

2.1 Gamification

Gamification can be defined as the use of game elements and mechanics as well as game

design techniques in non-game contexts to improve user experience and user engagement,

loyalty, and fun (Deterding, Dixon, Khaled, & Nacke, 2011; Lee & Hammer, 2011; Muntean,

2011). Gamification is a relatively recent concept on the market as well as in research, but its

popularity has been growing rapidly. Gamification was added to the Gartner Hype Cycle for

2011, in which they predicted gamification would be a key trend, as shown in Figure 2-1

(Goasduff & Pettey, 2011). In fact, the Gartner Hype Cycle for 2011 predicted that by 2014,

over 70% of Fortune Global 2000 organizations would have adopted gamification in some way.

Figure 2-1 Gartner Hype Cycle for 2011


It is no surprise that in recent years gamification has been used to

achieve higher levels of engagement, change behaviors, and stimulate innovation (Liyakasa,

2013). Some companies are already using gamification, such as Microsoft, Nike, SAP, American

Express, MLB, Salesforce.com, AXA, Deloitte, Samsung, Foursquare, USA Networks, LiveOps,

Dell, Foot Locker, eBay, Cisco, Siemens, and Yelp. These companies use gamification for non-

monetary incentive strategies and innovation by engaging employees to submit creative ideas or

solutions. Similarly, companies use gamification for marketing by altering customer behavior to

encourage them to purchase products or visit their website.

Microsoft’s productivity game, Communicate Hope, is one of the best examples of the

usage of gamification to increase internal productivity (R. Smith, 2011). Communicate Hope

motivated thousands of employees to participate in this particular game to complete beta feedback

tasks and in this way earn achievement points. In this program, the participants were able to

collect points by providing feedback on usability, by submitting bugs, or by submitting user

feedback. As a result, the users who opted to play the game submitted sixteen times more

feedback than those users who did not play. Furthermore, 97% of the participants responded that

they would participate in another beta program, whereas this figure had previously been between 50% and 70%.

Current location-sharing services like Foursquare are a prominent example of applying

gamification for marketing, as shown in Figure 2-2. Foursquare proved that simple game

mechanics can change behavior, engaging 10 million customers and supporting a successful

business model (Rimon, 2014). Foursquare employs gamification elements like points, badges,

levels, and leaderboards to motivate people to engage more with the service and ‘check in’ more

frequently. In addition, Foursquare also uses the reward system, “Mayorships,” which consists

of virtual rewards that can be converted into real products.


Figure 2-2 Example of Foursquare

This rapid development of gamification has caught the interest of researchers in non-

business contexts such as health (deWinter & Kocurek, 2014; Lister, West, Cannon, Sax, &

Brodegard, 2014; Wylie, 2010), interactive systems (Flatla, Gutwin, Nacke, Bateman, &

Mandryk, 2011), education (Lee & Hammer, 2011; Raban & Geifman, 2009; Rafaeli, Raban,

Ravid, & Noy, 2003; Ravid & Rafaeli, 2000), and sustainability (Berengueres, Alsuwairi, Zaki,

& Ng, 2013; Gnauk, Dannecker, & Hahmann, 2012), as well as in online communities and

social networks (Bista, Nepal, Colineau, & Paris, 2012; Thom, Millen, & DiMicco, 2012).

2.1.1 Gamification-Related Concepts

It is important not to confuse other game-related applications such as serious games and

playful interaction with gamification. Thus, in order to clarify these differences, this section begins by defining these similar concepts. An overview of the relations

between them is shown in Figure 2-3.


Figure 2-3 Gamification and Related Concepts

2.1.1.1 Serious Games

Serious games is a term that refers to digital games that are used for a primary purpose

other than pure entertainment (Susi, Johannesson, & Backlund, 2007). Another more precise

definition, proposed by Ritterfeld, Cody, and Vorderer (2009), is that serious games are "Any form

of interactive computer-based game software for one or multiple players to be used on any

platform and that has been developed with the intention to be more than entertainment.” In this

book, they divided serious games into several categories, such as Game-Based Simulations,

Game-Based Models, Game-Based Visualizations, Game-Based Interfaces, Game-Based

Productions, Game-Based Messaging/Advertising/Marketing, Game-Based Training, and Game-

Based Education/Learning. For example, as one of the key categories of serious games, Game-

Based Education/Learning can be defined as using gameplay to enhance motivation to learn,

engage education, or to enhance the effectiveness of content transfer or other specific learning

outcomes.


In terms of trying to solve problems with game thinking, the difference between

gamification and serious games is not very clear (Wu et al., 2012). Kapp (2014) also considered

serious games as a subset of gamification, describing serious games as one way in which

gamification manifests. However, Deterding et al. (2011) distinguished gamification from

serious games by defining serious games as those that are wholly games and do not include any

non-game components, while gamified applications will contain a mixture of game elements and

non-game elements. For example, the main goal for players in serious games is to achieve the

game goals by completing all the stages or being first in the ranking. The sequence of activities

and players’ actions leads to achieving these goals, and learning is produced as a side effect

(Silva et al., 2013). However, in gamification, the main goal is not to achieve these game goals

but to properly perform the learning activities. In order to reach that goal, game design elements

are included in learning activities. Thus, one of the main differences between gamified learning

situations and educational games is the intentionality of the game design elements within the

game or the activities.

2.1.1.2 Persuasive Games

To our knowledge, the distinction between gamification and persuasive games is not one of difference but of complementarity. Persuasive games can be defined as games that

are designed to change users’ attitudes or behaviors through persuasion and social influence but

not through coercion (Fogg, 2002). Additionally, Fogg (2002) identified seven common

elements contained in persuasive games, as follows: conditioning, reduction, self-monitoring,

suggestion, surveillance, tailoring, and tunnelling. All seven of these elements are related to

gamification. For example, tunnelling is a way of leading users through a sequence of events and


is used in gamification as levels, goals, and progression. Surveillance is also used in

gamification for key elements such as points and rewards.

2.1.2 Game elements

Yohannis et al. (2014) defined game elements as the elements that support the presence

of gameful characteristics, similar to Deterding’s (2011) definition, which is “elements that are

characteristic to games.” There is no single widely accepted classification of game elements. For

example, the popular game element “badges” is categorized into a game mechanic in

(Zichermann & Cunningham, 2011), a game interface design pattern in (Deterding et al., 2011),

and a game dynamic in (Iosup & Epema, 2014). However, all previous researches classified

game elements on several levels of abstraction from concrete elements, including badges and

leader boards, to more abstract elements, such as time constraints and styles of games. For

example, Zichermann and Cunningham (2011) categorized game elements into mechanics,

dynamics, and aesthetics (MDA) framework, as shown in Table 2-1. The MDA framework

helps in conceptualizing the relationship between the designer and user’s perspectives. From the

designer’s perspective, game mechanics (like points, controls, and levels) are used to achieve a

particular aesthetic (like a challenge or fellowship), whereas the user will first experience the

aesthetics and then start to unravel the mechanics through game dynamics (like time pressure

and sharing information).


Table 2-1 Categories of game elements based on Zichermann and Cunningham (2011)

Category     Description                                                   Example
Mechanics    The particular components of the game at the level of        Achievements, avatars, badges,
             data representation and algorithms                            levels, points, teams
Dynamics     The run-time behavior of the mechanics acting on player      Challenges, competition,
             inputs and each other's outputs over time                     cooperation, feedback, rewards
Aesthetics   Desirable emotional responses evoked in the player when      Constraints, emotions, narrative,
             she interacts with the game system                            progression, relationships

Similarly, Werbach and Hunter (2012) developed another formal approach, called the

Game Element Hierarchy, to conceptualize three categories of elements, which they termed

dynamics, mechanics, and components, in order of decreasing abstraction, as shown in Figure

2-4. Although the point of view and the names assigned to each level of element are different

between the two frameworks, the Game Element Hierarchy can be considered equivalent to the

MDA as follows: (1) The Components correspond to the mechanics of the MDA framework; (2)

The Mechanics correspond to the dynamics of the MDA framework; (3) The Dynamics

correspond to the aesthetics of the MDA framework.

Figure 2-4 The Game Element Hierarchy (adapted from Werbach & Hunter, 2012)


An alternative perspective developed by Deterding et al. (2011) is shown in Table 2-2.

They divided game elements into five categories: (1) interface design patterns; (2) game design

patterns or game mechanics; (3) design principles or heuristics; (4) conceptual models of game

design units; and (5) game design methods and design processes. Dicheva, Irwin, Dichev, and

Talasila (2014) also developed a two-level framework based on a framework developed by

Deterding et al. (2011). They combined the first two levels of Deterding’s classification into one

level and called it “game mechanics.” Levels (3) and (4) of Deterding’s classification were also

combined into one level and called “educational design principles.” Game mechanics included

elements such as points, badges, or progress bars, and educational design principles included

elements such as accrual grading, feedback, or students’ freedom-to-choose.

Table 2-2 Levels of game design elements

Level                       Description                                              Example
Game interface design       Common, successful interaction design components         Badge, leaderboard, level
patterns                    and design solutions for a known problem in a
                            context, including prototypical implementations
Game design patterns        Commonly reoccurring parts of the design of a game       Time constraint, limited
and mechanics               that concern gameplay                                     resources, turns
Game design principles      Evaluative guidelines to approach a design problem       Enduring play, clear goals,
and heuristics              or analyze a given design solution                        variety of game styles
Game models                 Conceptual models of the components of games or          MDA; challenge, fantasy,
                            game experience                                           curiosity; game design
                                                                                      atoms; CEGE
Game design methods         Game design-specific practices and processes             Playtesting, playcentric
                                                                                      design, value conscious
                                                                                      game design

Given these perspectives on game elements, the commonality is that they are classified or

categorized based on levels of abstraction: the MDA framework and Game Element Hierarchy

progress from the abstract to the concrete, while the levels of game design elements progress

from the concrete to the abstract.
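The correspondence between the frameworks can be summarized compactly. The following sketch is an illustrative encoding only, not part of any cited framework; it maps three shared levels of abstraction onto the terms used by the MDA framework and the Game Element Hierarchy, following the correspondence noted above.

    from enum import Enum

    class AbstractionLevel(Enum):
        """Shared levels of abstraction used by the classification frameworks."""
        CONCRETE = 1      # tangible elements such as badges and leaderboards
        INTERMEDIATE = 2  # run-time processes such as feedback and competition
        ABSTRACT = 3      # experiential qualities such as emotions and narrative

    # Framework-specific names for each level (MDA: Zichermann & Cunningham, 2011;
    # Hierarchy: Werbach & Hunter, 2012).
    FRAMEWORK_TERMS = {
        AbstractionLevel.CONCRETE:     {"MDA": "mechanics",  "Hierarchy": "components"},
        AbstractionLevel.INTERMEDIATE: {"MDA": "dynamics",   "Hierarchy": "mechanics"},
        AbstractionLevel.ABSTRACT:     {"MDA": "aesthetics", "Hierarchy": "dynamics"},
    }

    def equivalent_term(level: AbstractionLevel, framework: str) -> str:
        """Return the name a given framework uses for an abstraction level."""
        return FRAMEWORK_TERMS[level][framework]

    # Example: MDA "mechanics" sit at the same level as Hierarchy "components".
    assert equivalent_term(AbstractionLevel.CONCRETE, "Hierarchy") == "components"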


2.1.2.1 Game Components

Game components are the basic parts of the game world manipulated by the players or

the system, and they are probably the most concrete category of game elements. These

components are known as “game interface design patterns” according to Deterding’s Framework

and “game mechanics” according to the MDA Framework. These components include

achievements, avatars, badges, collections, combat, content unlocking, gifting, leaderboards,

levels, points, quests, social graphs, teams, and virtual goods. Lower-level components may

implement more than one aspect of higher-level mechanics and dynamics because they are the

specific instantiations of game dynamics and mechanics. For example, some mechanics such as

challenges and feedback are used to convey a sense of progression (game dynamics). In turn,

these mechanics can be implemented by deploying within-game components such as points,

levels, and badges. It should be noted that a given element can be used to represent more than

one mechanic or dynamic within the game. For example, points can be used as a representation

of feedback and progression, as well as of relationships and competition when combined with a

leaderboard.
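As a brief illustration of this point, the sketch below shows a single concrete component, points, serving two mechanics at once: immediate feedback to the individual player and competition via a leaderboard. The class and method names are hypothetical and not drawn from any cited system.

    from collections import defaultdict
    from typing import Dict, List, Tuple

    class PointsComponent:
        """One concrete component (points) reused by several mechanics."""

        def __init__(self) -> None:
            self.points: Dict[str, int] = defaultdict(int)

        def award(self, player: str, amount: int) -> str:
            """Feedback mechanic: awarding points returns an immediate progress message."""
            self.points[player] += amount
            return f"{player} earned {amount} points (total: {self.points[player]})"

        def leaderboard(self, top_n: int = 5) -> List[Tuple[str, int]]:
            """Competition mechanic: the same points, ranked against other players."""
            ranked = sorted(self.points.items(), key=lambda kv: kv[1], reverse=True)
            return ranked[:top_n]

    # Example: one component serving two mechanics (feedback and competition).
    scores = PointsComponent()
    print(scores.award("student_a", 10))
    print(scores.award("student_b", 15))
    print(scores.leaderboard())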

2.1.2.2 Game Mechanics

Game mechanics describe the specific components that are responsible for the function

of the game. They drive users to engage with the content and continue to drive the action

forward. Game mechanics are comprised of challenges, chances, competition, cooperation,

feedback, resource acquisition, rewards, transactions, turns, and win states. For example,

feedback is one of the most important aspects to motivate players in gamification systems.

Feedback can provoke increased intrinsic motivation and autonomy by providing unexpected,


informal feedback or reinforcement about the progress of the player. Feedback can also generate

behavior changes when metrics are provided through the given feedback (Werbach & Hunter,

2012). Table 2-3 is a list that contains short descriptions of the most important game mechanics.

Table 2-3 Game Mechanics Descriptions

Game Mechanics   Descriptions
Reward           Benefits a player gets for completing some action or reaching some achievement
Status           A means of recognition, fame, prestige, attention and, ultimately, the esteem and respect of others
Competition      Activities in which players or teams go up against each other, resulting in one or more winners
Cooperation      Collaborating in a team to reach a common goal
Feedback         Returning information to players and informing them of where they are at the present time
Challenges       Giving players direction for what to do within the gamification system
Chances          The element(s) of randomness in a game

2.1.2.3 Game Dynamics

Game dynamics is the structure of the game in which players can interact with game

mechanics and components. These elements reveal the underlying forces, such as constraints,

emotion, narrative, and progression that are present in almost all games. For example,

constraints are present in every game, as the game limits players' freedom in order to create

meaningful choices and interesting problems. Games can also create many different kinds of

emotions, from joy to sadness. The emotion of fun is especially important in games, as the

experience and joy of accomplishment is the emotional reinforcement that keeps people playing.

Narrative is the structure that makes the game into a coherent whole, although it does not have to

be explicit, like a storyline in a game, to work; it can also be implicit, where the whole

experience simply feels like it has a purpose and is not just a collection of nice ideas. Lastly,

progression is the idea of giving players the feeling of moving forward and improving so that


instead of doing the same thing over and over again, they are able to progress (Werbach &

Hunter, 2012).

2.1.3 Gamification in Education

Gamification has become a commonly recommended pedagogical tool (Anderson &

Rainie, 2012; Boulet, 2012; Chou, 2013; Kapp, 2012), and several game-design mechanics have been demonstrated successfully in educational environments (de-Marcos et al., 2014; Stott & Neustaedter, 2013). Schools already have several game-like elements such as points

(grades), level (academic year), feedback (comments from teachers), and competition (ranking).

However, the default environment of school often results in undesirable outcomes such as

disengagement, cheating, and dropping out. There are several reasons for these situations. For

example, although educational settings provide feedback to students, it is not immediate and

frequent feedback: teachers can often only evaluate and provide feedback to one student at a

time, and feedback via grading takes time. Additionally, teachers typically present information

to their classes in categories that scale by difficulty, a process known as scaffolded instruction,

but it can be difficult within this structure to accommodate each individual student's needs. Thus

the existence of game-like elements in school does not translate directly to engagement. To solve

this problem, Lee and Hammer (2011) theorized that gamification could also be applied to the

education field as a tool to increase student engagement and drive desirable learning behaviors.

Further, they suggested that the most important thing is to understand under what circumstances

game elements can drive learning behavior. For this, understanding gamification’s impact on the

emotional and social areas of students is the key to engaging students. Therefore, Lee and Hammer

(2011) expected that gamification can change the rules, students’ emotional experiences, and

students’ sense of identity and social positioning. Several researchers have also suggested


applying gamification to increase student motivation by providing students with clear,

achievable goals (Landers & Callan, 2011), by providing immediate and frequent feedback

(Kapp, 2012), by creating a narrative context around a task (Clark & Rossiter, 2008), by encouraging competition (Hamari, 2013), and by showing a visual display of progress (Camilleri,

Busuttil, & Montebello, 2011; Kapp, 2012).

Beyond these studies considering gamification’s potential, numerous studies on

gamification have been conducted to explore its impact on students’ learning, and the number of

such studies has been increasing rapidly, as shown in Figure 2-5. Furthermore, based on the

systematic review conducted by Bodnar, Anastasio, Enszer, and Burkey (2016), 191 papers

on applying gamification to undergraduate engineering classrooms were published from 2000 to 2014, and more than half of these were published since 2010. This suggests

that gamification is becoming a more popular subject for academic inquiry.

Figure 2-5 Work distribution by year of publication (*: 1st Quarter of Year)

According to the study conducted by Dicheva et al. (2015), a majority of gamification

studies in education were conducted in the context of in-class higher education sites and for

developing gamification support platforms, as shown in Figure 2-6. Dicheva et al. (2015) also

grouped the context of gamification into the following categories: gamifying courses without

online gamification support, gamifying Massive Open Online Courses (MOOCS), gamifying


blended learning courses, gamifying e-learning sites, and developing gamification support

platforms.

Figure 2-6 Work distribution by subject (adapted from Dicheva et al., 2015)

Denny (2013), for example, investigated how badges could be used to motivate students

via online gamification support platforms. Results showed that the presence of badges had a

clear positive effect on both the number of answers submitted and the duration of engagement,

yet it had no effect on the number of questions authored by students or the perceived quality of

the learning environment. Muntean (2011) made a theoretical analysis of gamification as a tool

to increase engagement in e-learning courses. Although the author did not conduct the empirical

research, he suggested that gamification mechanics can be used to motivate and trigger desired

behaviors on students based on Fogg’s Behavior Model. He also provided a list of gamification

elements, explaining how they could be included in an e-learning course. Gåsland (2011)

conducted empirical research to answer a research question about how motivation using aspects

of game mechanics such as progression and points can be used for e-learning courses. Results

showed that the gamified e-learning system on average was considered to be somewhat

motivating. She suggested, however, that much more research is needed to test other

gamification mechanisms and their combinations.


Burkey, Anastasio, and Suresh (2013) applied gamification to a traditional chemical

engineering laboratory course. They used game elements such as guilds, points, levels, and

reputation to motivate students to perform extra learning tasks under a collaborative team-based

game context. In addition, some studies that applied gamification to traditional courses revealed

improvements in teamwork (Caton & Greenhill, 2013; Mitchell, Danino, & May, 2013) and

learning outcomes (Akpolat & Slany, 2014).

Despite these positive outcomes, a majority of the prior research has found positive

effects to exist only in parts of the considered relationships between the gamification elements

and studied outcomes (Attali & Arieli-Attali, 2015; Coetzee, Fox, Hearst, & Hartmann, 2014;

Denny, 2013; Domínguez et al., 2013; Li, Grossman, & Fitzmaurice, 2012; A.-L. Smith &

Baker, 2011). For example, McDaniel, Lindgren, and Friskics (2012) showed that applying

gamification had positive motivational effects on students but at the same time caused frustration

probably due to the difficulty of earning the achievements. Domínguez et al. (2013) indicated that

gamification, especially its reward systems and competitive social mechanisms, motivated

students emotionally and socially, but the students in the gamification group showed decreased

class participation and poorer performance on exams compared with students in the non-

gamification group.

It remains unclear what effect extrinsic game mechanics have on intrinsic motivation

and how exactly they affect motivation, either positively or negatively (Bielik, 2012). For

example, Haaranen et al. (2014) showed that gamification implemented with badges did not

affect students’ learning outcomes. Furthermore, they reported that some students had strongly

negative emotions towards badges. In contrast, recent studies of badge systems suggested that such negative aspects are mostly attributable to poor design (Antin & Churchill, 2011; Bielik, 2012).

The majority of these authors shared the opinion that gamification has the potential to improve

learning if it is well designed and used correctly. Nicholson (2012) has suggested that the UCD


process is key to developing successful gamification systems, and it is possible to avoid

Gartner’s prediction of an 80% failure rate for gamified applications by implementing early

usability testing. However, no studies have been conducted on developing gamification systems

using UCD. The current study therefore examines the effects of the UCD process on

gamification’s usability and, more importantly, enjoyability.

2.2 User Centered Design

Software designers usually try to improve their software to make it easy for people to

use by involving users more actively in the design process, an approach that is known as user

centered design (UCD). Norman and Draper (1986) originally used the term UCD in a book

entitled User Centered System Design: New Perspectives on Human-Computer Interaction. In

this book, they emphasized the importance of having a good understanding of the users, though

without necessarily involving users actively in the design process. Norman also defined UCD as

“a philosophy based on the needs and interests of the user, with an emphasis on making products

usable and understandable” (Donald Norman, 1988). He provided four suggestions on what a design should do:

1. Make it easy to determine what actions are possible at any moment.

2. Make things visible, including the conceptual model of the system, the alternative actions, and the results of actions.

3. Make it easy to evaluate the current state of the system.

4. Follow natural mappings between intentions and the required actions; between actions and the resulting effect; and between the information that is visible and the interpretation of the system state (D. Norman, 1988, p. 188).


These suggestions are intended to make sure that (1) the user can figure out what to do,

and (2) the user can tell what is going on. In the 1990s, although few systems designers and

developers had knowledge of the basic principles of UCD, the approach had been promoted through a number of methods, methodology books, and research studies and had been shown to apply in various domains, including the design of websites, products, and documents. As a result, most

designers today have some knowledge of—or at least exposure to—UCD practices, whether they

are aware of them or not. Jeffrey and Chisnell (1994) described UCD as techniques, processes,

methods, and procedures for designing usable products and systems with the user at the center of

the process. The difference between Norman and Jeffrey in terms of their definitions of UCD is

whether the user is included in the design process. Pancake (1998) also emphasized that user

involvement is essential throughout the design process, since different types of usability

problems will be caught and corrected at different points. Gulliksen et al. (2003) emphasized

usability issues and user involvement throughout the entire design process.

The increasing attention to UCD led to the development of the ISO 9241-210 (2009)—

formerly ISO 13407— standard on human-centered design for interactive systems. ISO 9241-

210 (2009) states that "User centered design is an approach to systems design and development

that aims to make interactive systems more usable by focusing on the use of the system and

applying human factors/ergonomics and usability knowledge and techniques."

2.2.1 UCD Principles

UCD principles are a set of rules intended to help designers make design decisions.

These principles shape the design of specific components of interaction and have been developed through extensive study. Rules for UCD appear in many different guidelines. For

example, Norman (1988) summarized the main principles of UCD as follows:


1. Use both knowledge in the world and knowledge in the head.

2. Simplify the structure of tasks.

3. Make things visible: bridge the gulfs of Execution and Evaluation.

4. Get the mappings right.

5. Exploit the power of constraints, both natural and artificial.

6. Design for error.

7. When all else fails, standardize.

These principles are based on the theory of action and interaction, which is described as

a continuum of stages on the action/execution side and the perception/evaluation side of human-

computer interactions. Norman’s (1998) theory included seven stages of activities, as follows:

one for goals (Establishing the goal), three for execution (Forming the intention, Specifying the

action sequence, and Executing the action), and three for evaluation (Perceiving the system state,

Interpreting the state, and Evaluating the system state with respect to the goals and intentions).

Norman (1998) described how the action cycle can be used as a checklist for design so as to

avoid gulfs in execution and evaluation. Figure 2-7 shows the gulfs and bridges of execution and

evaluation.


Figure 2-7 The gulfs and bridges of execution and evaluation

The gulf of execution is the difference between the user’s formulation of the actions to

reach the goal and the actions allowed by the system. This is bridged by having a sequence of

four steps: 1) Forming the intention; 2) Planning the sequence of actions; 3) Making contact with the user interface; and 4) Interacting with the physical system. The gulf of evaluation is the distance

between the physical presentation of the system state and the expectations of the user. This is

bridged by having a sequence of four steps: 1) Displaying the output of the current state; 2)

Interface display; 3) Interpretation; and 4) Evaluation. The goal in both cases is to minimize

cognitive effort.

Gulliksen et al. (2003) provided 9 key principles for UCD based on existing theory and validated these principles by applying and evaluating them in a case study. The key principles for UCD and their descriptions are shown in Table 2-4.


Table 2-4 Key principles for UCD (adapted from Gulliksen et al., 2003)

User focus: The user’s goals, tasks, needs, and context of use should guide development.
Active user involvement: Users should actively participate throughout the development process and system lifecycle.
Evolutionary system development: The system’s development should be both iterative and incremental.
Simple representations: The design must be represented in ways that can be easily understood by users and other stakeholders.
Prototyping: Prototypes should be used early and continuously in cooperation with users.
Explicit and conscious design activities: The development activities should contain design activities dedicated to user interactions.
Professional attitude: The development process should be conducted by effective multidisciplinary teams.
Usability champions: Usability experts should be involved throughout development.
Holistic design: All aspects that can influence use situations should be developed in parallel.

To sum up, the main focus of the principles from both Norman (1988) and Gulliksen et

al. (2003) is usability in terms of effectiveness, efficiency, and user satisfaction. For

example, Gulliksen et al. (2003) emphasized the role of usability experts and various aspects that

could potentially affect use situations. The principles suggested in Norman (2002) were also

focused on providing specific design guidelines for ensuring system usability. Another important

concept in both principles is user experience, which is the entire context in which users interact

with a system or product (Alben, 1996; Jakob Nielsen & Norman, 2014). These principles also

share a common characteristic in that they consider UCD as a process rather than a physical or

visual aspect of a system or product. In Norman (2002), the principles were stated as a set of

action statements for practitioners to follow, and Gulliksen et al. (2003) emphasized the user

involvement and iterations in design processes as well as the formation of teams and attitudinal

aspects. These UCD principles can be put into practice through various evaluation methods for systems under development, including methods for evaluating usability and user experience.

2.2.2 Usability

The goal of UCD is to improve systems to have a high degree of usability. ISO 9241-

210 (2009) defines usability as the "extent to which a system, product or service can be used by

specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a

specified context of use." According to the ISO definition, usability is not a single, one-

dimensional property of a user interface but has multiple components, which can be measurable

elements such as effectiveness, efficiency, and satisfaction, as follows.

• Efficiency: the level of resources consumed in performing tasks

• Effectiveness: the ability of users to complete tasks using the technology and the quality

of the output of those tasks

• Satisfaction: users’ subjective satisfaction with using the technology
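To make these measurable components concrete, the following minimal sketch computes them from a hypothetical task log; the data, field names, and single-item satisfaction rating are illustrative assumptions rather than measures used in this dissertation.

    # A minimal sketch (hypothetical data) of how the three usability components
    # listed above might be computed from task-level observations.
    task_log = [
        {"participant": "P1", "completed": True,  "time_s": 95,  "satisfaction": 4},
        {"participant": "P2", "completed": True,  "time_s": 120, "satisfaction": 5},
        {"participant": "P3", "completed": False, "time_s": 240, "satisfaction": 2},
    ]

    # Effectiveness: proportion of tasks completed successfully.
    effectiveness = sum(r["completed"] for r in task_log) / len(task_log)

    # Efficiency: mean time (resources consumed) for the tasks that were completed.
    times = [r["time_s"] for r in task_log if r["completed"]]
    efficiency_s = sum(times) / len(times)

    # Satisfaction: mean subjective rating across participants.
    satisfaction = sum(r["satisfaction"] for r in task_log) / len(task_log)

    print(f"Effectiveness {effectiveness:.0%}, efficiency {efficiency_s:.0f} s/task, "
          f"satisfaction {satisfaction:.1f}/5")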

Usability testing (UT) is a method that observes users while they navigate a site to

perform specific tasks. UT has been widely acknowledged as a fundamental method to evaluate

products and systems. It is an evaluation method in which users directly participate (J. S. Dumas

& Redish, 1999; Jeffrey & Chisnell, 1994; Jakob Nielsen, 1994a; Plaisant & Shneiderman,

2005), as opposed to usability inspection methods (Cockton, Woolrych, Lavery, Sears, & Jacko,

2009; Jakob Nielsen, 1994b) and model-based inspection (Kieras, 2009).

There are several dimensions to usability, such as effectiveness, efficiency, and

satisfaction. These dimensions can be measured by using quantitative and qualitative methods.

For example, task completion, accuracy recall, and quality of outcome can be used to measure

effectiveness (Bayles, 2002; S. T. Dumas, Cutrell, & Chen, n.d.; Marshall, Foster, & Jack,


2001). Time, mental effort, and usage patterns can also be used to measure efficiency (Drucker,

Glatzer, De Mar, & Wong, 2002; Golightly, Hone, & Ritter, 1999). Preference, ease of use, and

specific attitudes can be used to measure satisfaction (Gutwin, 2002). To collect these data, the

most frequently methods used in usability testing are think-aloud protocol, eye movement

tracking, heuristic evaluation, questionnaires and surveys, and interviews and focus groups. The

think-aloud method is among the most common evaluative methods in usability testing. It was

first introduced in software development around 1980 (Lewis & Mack, 1982). While no standard

yet exists, numerous variations of the method have been employed. The most accepted

definition, described by Jakob Nielsen (1994a), is that “In a thinking aloud test, evaluators ask

test participants to use the system while continuously thinking out loud — that is, simply

verbalizing their thoughts as they move through the user interface.”

The think-aloud method enables researchers to understand how the user thinks about the

software while using it. Users complete their tasks at the computer in a testing room equipped to

capture audio and video. If the user is not talking, the evaluator encourages the user to say what

he/she is thinking. A typical think-aloud method is conducted with five to ten participants. Each

evaluation session lasts typically for one to two hours.

There are mainly five steps in this variant of the think-aloud method: (a) Greeting the

participant; (b) Data gathering on the participant’s background; (c) Observing the participant

interact with the software solving predefined tasks; (d) Debriefing the participant; (e) Thanking

the participant for coming.

The primary benefit of think-aloud methods is that they provide input to designs from

‘actual’ users who are representative of the user population. Think-aloud encourages the person

being studied to verbalize their thoughts, thereby revealing the reasons for the observed behavior (Gray & Wardle, 2013). The simplicity of the approach is another benefit of think-aloud methods: Ramey et al. (2006) and Preece et al. (2002) reported that this method can produce a large number of results quickly. Furthermore, Nielsen (1993) points out that the think-aloud method allows evaluators to collect a wealth of qualitative data with a fairly small number of users. In addition, Nielsen (1993) described flexibility as a strength of the think-aloud

method, as it can be used at any stage in the development lifecycle, from early paper prototypes

to fully implemented, running systems.

Many studies about think-aloud protocols in usability testing of online learning systems

followed (Armstrong, 2011; Barzilai & Zohar, 2012; Boren & Ramey, 2000; Damico & Baildon,

2007; Guan, Lee, Cuddihy, & Ramey, 2006; Olmsted-Hawala, Murphy, Hawala, & Ashenfelter,

2010). For instance, Armstrong (2011) employed the think-aloud method in the context of online

learning and focused on how undergraduate university students use and evaluate educational

websites. Aranyi and his colleagues (2012) also used the think-aloud method to conduct an

exploratory study of the interaction experience with a news website. They provided five

categories of experience based on the participants’ evaluative statements: impression, content,

layout, information architecture, and diversion (Aranyi et al., 2012). Similarly, Barzilai and

Zohar (2012) examined epistemic thinking in action using the think-aloud method to figure out

the relationship between students’ knowledge construction and their online practices.

2.3 Motivation

Motivation, which is directly related to curiosity, persistence, learning, and performance,

is one of the most important psychological concepts in education (Deci & Ryan, 1985b). Deci et

al. (1991) define academic motivation as students’ interest in learning, valuing of education, and

confidence in their own capacities and attributes. Several researchers have applied motivation

theory in their attempts to discover what motivates individuals to succeed academically. This


section will discuss the development of research in the area of academic motivation and

introduce self-determination theory.

2.3.1 Cognitive Evaluation Theory

Deci and Ryan (1985a) have contributed much to the theory of motivation by

developing Cognitive Evaluation Theory (CET) based on findings from numerous empirical

studies. This theory explains how external factors (e.g., rewards, feedback, deadlines) support or

thwart intrinsic motivation by increasing or diminishing one’s feeling of autonomy or

competence. Deci and Ryan (1985a) explained that all external factors have two functional

aspects: a controlling aspect and an informational aspect. That is, informational external events

increase intrinsic motivation, whereas controlling external events decrease it. For example, if

students experience a reward as an informational event allowing choice and providing

competence-relevant feedback, it enhances a student’s sense of autonomy and competence,

which, in turn, support intrinsic motivation. However, if the reward is used as a controlling

aspect of an external event in order to pressure students toward a specific outcome or way of

behaving, it diminishes intrinsic motivation.

Over 100 laboratory experiments and field studies have been conducted to support or

refute CET based on various types of rewards and other external events and their corresponding

effects on intrinsic motivation. The first laboratory study to test the effects of reward on intrinsic

motivation was conducted by Deci (1971). He recruited 24 college students and asked them to

complete a puzzle-solving task called the SOMA puzzle. The participants in the treatment group

were paid $1.00 for each puzzle they solved, while the participants in the control group received

no reward for participating. The experimenter provided free time to participants in the middle of

the puzzle-solving session during which they could complete the puzzle-solving task as they


pleased while the experimenter observed them and recorded the time that each participant spent

engaged in the task. The experimenter used these observations to measure participants’ intrinsic

motivation. The results of this study showed that the participants who were paid money to play

spent significantly less time playing on their own than participants who were not rewarded for

playing. However, Deci (1971) also conducted another experiment in which he varied the type

of reward from monetary to verbal while all other variables remained constant. He found that

those who received verbal rewards played longer than those who received monetary rewards

alone or both monetary and verbal rewards together, resulting in an increase of intrinsic

motivation.

Lepper, Greene, and Nisbett (1973) also found that reward had detrimental effects on

intrinsic motivation. In this study, participants aged between 40 and 64 months were observed in

a free play period to determine their initial intrinsic interest in a drawing activity. A “Good

Player Award” card was used as an extrinsic reward. While the experimenters did not

provide the reward information to participants in the unexpected-award and the no award

groups, they assigned the Good Player Award to participants in the expected-award group. The

results of the study found that children in the expected-award group spent less time on the

drawing activity than others in either the unexpected-award or no reward groups. These results

indicated that subjects who received expected rewards experienced decreases in intrinsic

motivation due to the reward serving as a controlling agent. Since the early studies conducted by

Deci (1971) and Lepper, Greene, and Nisbett (1973), a number of studies have investigated the

effects of external reward on intrinsic motivation in an educational setting. These studies found

that external factors such as surveillance (Lepper & Greene, 1975), deadlines (Amabile,

DeJong, & Lepper, 1976), imposed rules and limits (Koestner, Ryan, Bernieri, & Holt, 1984),

imposed goals (Mossholder, 1980), competition (Deci, Betley, Kahle, Abrams, & Porac, 1981),

and evaluation (Ryan, 1982) decrease intrinsic motivation. A later meta-analysis of 128


laboratory experiments confirmed that, whereas positive feedback enhances intrinsic motivation,

tangible rewards significantly undermine it (Deci, Koestner, & Ryan, 1999).

However, since CET assumes that all motivation stems from intrinsic motivation, this

theory cannot explain the circumstance in which activities are not intrinsically interesting but are

completed as required duties. As a result, Ryan & Deci (2000) further developed the concept of

extrinsic motivation in what became known as Self-Determination Theory (SDT).

2.3.2 Self-Determination Theory

Self-Determination Theory (SDT), together with research on individual differences in

motivational orientations, contextual influences, and interpersonal perceptions (Ryan & Deci,

2000), explains how extrinsically motivated behavior can become autonomous. Autonomous

motivation encompasses both intrinsic motivation and well internalized extrinsic motivation

such as integrated and identified regulation. SDT proposes that three forms of motivation exist,

namely, intrinsic motivation, extrinsic motivation, and amotivation, which, based on the level of autonomy associated with each, lie on a continuum ranging from high to low self-determination (Figure 2-8).


Figure 2-8 The spectrum of motivation according to SDT (adapted from Ryan & Deci, 2000)

Intrinsic motivation, the first form of motivation under SDT, can be defined as the doing

of an activity for its own sake: the activity itself is interesting, engaging, or in some way

satisfying. When intrinsically motivated, a person performs on a voluntary basis in the absence

of external contingencies (Deci and Ryan, 1985a). Intrinsic motivation is thought to constitute

the most autonomous form of motivation, satisfying the needs to feel competent,

autonomous, and related (Deci and Ryan, 1985a). Thus, activities that lead the individual to

experience these feelings are intrinsically rewarding and are likely to be performed again. The

second form of motivation is extrinsic motivation which refers to the doing of an activity, not for

its inherent satisfaction, but to attain some separable outcome, such as reward or recognition or

praise from other people. Since external reasons that motivate individuals to perform can differ,

SDT specifies different subtypes of extrinsic motivation depending on how internalized the

motivation is. These multidimensional extrinsic motivations are divided into external,

introjected, identified, and integrated regulations, and they can vary from an entirely external

locus of causality to an internal locus of causality, as well as from lower to higher self-


determination. External regulation can be defined as the doing of an activity to satisfy an

external demand or to obtain an external reward; this regulation represents the least autonomous form of extrinsic motivation. Introjection and identification are both combinations of internal

and external loci of causality. While introjected regulation behaviors are more controlled by

external loci of causality, identification regulation behaviors are more controlled by internal loci

of causality. The last type of extrinsic motivation is integrated regulation, which is the most

autonomous form of extrinsic motivation. This behavior occurs when identified regulations have

been assimilated to the self (Deci et al., 1991). Finally, amotivation is the last form of motivation

and is considered to have the lowest level of autonomy on the continuum of motivational styles.

Markland & Tobin (2004) define amotivation as “a state of lacking any intention to engage in

behavior.” Several studies have tested key SDT constructs in both lab-based and classroom

settings, making SDT one of the most empirically validated theories for understanding

educational motivation.

As we discussed above, in consideration of intrinsic motivation and extrinsic motivation,

the most central distinction in SDT is between autonomous motivation and controlled motivation

(Deci & Ryan, 2008). Autonomous motivation regulates behavior through experiences of volition, psychological freedom, and reflective self-endorsement. Controlled motivation consists of two types of external regulation, namely external and introjected regulation, and regulates behavior through external contingencies of reward or

punishment that pressure a person to think, feel, or behave in particular ways. In education

fields, numerous researchers have conducted experiments in which they used academic

motivation to predict students’ learning and performance (Ayub, 2010; Fortier, Vallerand, &

Guay, 1995; Herath, 2015; Maurer, Allen, Gatch, Shankar, & Sturges, 2012; Park et al., 2012;

Sturges, Maurer, Allen, Gatch, & Shankar, 2016; Turner, Chandler, & Heffer, 2009). For

example, Fortier et al (1995) recruited 263 9th grade students and asked them to complete a


questionnaire for academic motivation to test the relationship between autonomous academic

motivation and school performance. They found that the students who have higher autonomous

academic motivation showed higher performance in school. This result revealed that

autonomous forms of motivation increase academic performance. Recently, Herath (2015) also

investigated how college students’ motivation may explain learning outcomes in information

systems courses. He recruited 160 undergraduate students and 109 graduate students and asked

them to complete an academic motivation scale questionnaire. He found that intrinsic motivators

are positively related to student perceptions of affective and cognitive learning. However, he

failed to find a strong effect of intrinsic motivation on learning in overall grades or exam grades.

In addition, he observed that extrinsic motivation has a greater effect on participation grades,

suggesting that students who identify external reasons for learning the material tend to put forth

more effort in assignments and in-class activities.

2.3.3 Self-Determination Theory for Gamification

Most studies of gamification examine only reward-based gamification systems. For

example, students often receive points when they complete a predefined task in the gamification

system. These points can then be converted into levels or rankings and can also be used in a

leaderboard to encourage competition between students. For this reason, gamification has

already become a controversial pedagogical tool, critiqued within a CET framework for

diminishing students’ intrinsic motivation. On the other hand, gamification provides students

with a non-controlling setting in which the implementation of game elements may indeed

improve intrinsic motivation by satisfying users’ innate psychological needs for autonomous

motivation (Deterding, 2014; Francisco-Aparicio et al., 2013; Pe-Than et al., 2014; Peng et al.,

2012). In addition, Deterding (2011, 2012) suggested the need for better understanding of the


psychological mechanisms underlying gamification. However, currently very few studies have

attempted to empirically investigate the effects of game elements on motivation and

performance (Deterding, 2011; Hamari et al., 2014; Seaborn & Fels, 2015). Furthermore, these

studies have not considered the quality of motivation (i.e., intrinsic and extrinsic motivation).

Even though there is no previous study regarding the application of SDT to

gamification, several researchers have applied SDT in investigating motivation in computer

games and game-based learning. For example, Przybylski (2006) found that the basic

psychological needs of intrinsic motivation predicted both enjoyment and future game play.

Sheldon & Filak (2008) supported this conclusion, finding that the three basic psychological

needs of autonomy, competence, and relatedness within a game-learning context predicted

students’ affect and performance. Thus, it is important to satisfy students’ basic psychological

needs of autonomy, competence, and relatedness in a gamification context. Competence, which

is the need to be effective and master a problem in a given environment, can be achieved

through certain game elements. Difficult goals encourage higher expectations which in turn

increase performance, and the completion of a task leads to a sense of competence and higher

satisfaction, ultimately leading to an increase in intrinsic motivation. For example, points can be

used to quantify different goals. A level or progress bar visually indicates the player's progress

over time, thereby providing sustained feedback. Badges serve as visual symbols of

achievement, supporting the competence component of self-determination theory. Leaderboards

permit social comparison and a means to display competence to one’s peers. Thus, the feedback

function of these game design elements evokes feelings of competence, as it directly

communicates the success of a player's actions. Autonomy, which is the need to control one’s

own life, can be understood in a learning context as the ability of learners to make choices about

how they learn with opportunities to take responsibility for their own learning. Since an

individual’s control over his or her experience is thought to be a crucial component of active


learning and is key to the concepts of self-determination theory, it is very important that

gamification should provide learners with as much control as possible. If gamification provides

multiple paths to achieve the goal, it is possible for players to prioritize and choose which paths

are most relevant to them. For example, avatars offer players freedom of choice, while

leaderboards and feedback encourage engagement and fulfill the need for relatedness (the need

to interact and be connected with others) by providing a choice for learners to either collaborate

with or compete among their peers.

2.4 Personality

Personality traits refer to individual differences that explain the unique and consistent

patterns of cognitions, behavior, and emotions shown by individuals in a variety of situations

(MacKinnon, 1944). This definition is the source of the Big Five personality factors that are

commonly addressed in educational psychology (Hogan, Hogan, & Roberts, 1996). These

factors include extraversion, agreeableness, conscientiousness, neuroticism, and openness to

experience. Extroversion includes the traits of sociability, spontaneity, and adventurousness.

Agreeableness is characterized as tending to be honest, courteous, acquiescent, and kind. While

conscientiousness is linked to responsibility, dependability, and orderliness, neuroticism is

characterized by insecurity, emotional instability, and immaturity. Openness to experience is

associated with curiosity, flexibility, intellect, and originality (O’Brien & DeLongis, 1996).

Since 2010, more than 1000 empirical studies have investigated the moderating effects

of personalities on motivation (Clark & Schroth, 2010; Komarraju et al., 2009; Önder et al., 2014), learning styles (Busato et al., 1998; Furnham, 1996; Zhang, 2003), and computer-based

learning (Buckley & Doyle, 2017; Cohen & Baruth, 2017; Haron & Suriyani Sahar, 2010;

Nakayama, Mutsuura, & Yamamoto, 2014; Reza & Khan, 2014). Komarraju et al. (2009) have


examined how personality is related to academic motivation. They found that openness to

experience positively correlated with intrinsic motivation; neuroticism and extraversion

positively correlated with extrinsic motivation; and conscientiousness positively correlated with

both intrinsic and extrinsic motivation. In addition, agreeableness and conscientiousness both

correlated negatively with amotivation. Showing results similar to those of a previous study

conducted by Komarraju et al. (2009), Clark & Schroth (2010) also demonstrated a relationship

between personality and academic motivation. They suggest considering students’ personality

when devising teaching, planning, and learning strategies.

Several studies were conducted to investigate the moderating effects of personalities on

learning styles, as measured by the Index of Learning Styles (ILS), and their results suggest

complex links between learning styles and personality traits. Furnham (1996) studied the relation

between learning styles and the Big Five personality factors. He found that conscientiousness

was associated positively with meaning-, reproduction-, and application-directed learning styles,

and openness to experience correlated positively with meaning- and application-directed

learning styles. Conscientiousness and openness to experience were associated negatively with

an undirected learning style. In contrast, neuroticism correlated positively with an undirected

learning style and negatively with meaning- and reproduction-directed learning styles. Busato et

al. (1998) also investigated the relationship between learning styles and the Big Five personality

factors, and their results show the same trends as Furnham’s study. In addition, Zhang (2003)

showed that conscientiousness and openness to experience predicted a deep approach to learning

and that neuroticism is a good predictor for the surface approach to learning.

Previous studies on computer-based learning claim that personality affects the adoption

of and familiarization with e-learning (Buckley & Doyle, 2017; Cohen & Baruth, 2017; Haron &

Suriyani Sahar, 2010; Nakayama et al., 2014; Reza & Khan, 2014). For example, Haron and

Suriyani Sahar (2010) found that extroversion, agreeableness, and emotional stability have


significant negative correlations with the adoption of e-learning, while high conscientiousness

and openness to experience correlate positively with the adoption of e-learning. Nakayama et al.

(2014) investigated how a student’s personality can affect learning within an online course

environment, and their results showed that the learner's personality affects his/her note-taking

behavior while learning online. Recently, Cohen and Baruth (2017) investigated the relationship

between learners' personality and their satisfaction within an online academic course. They

reported that openness to experience and conscientiousness are positively associated with satisfaction with online courses.


Chapter 3

The Effects of Gamification on Engineering Lab Activities

The following chapter is from the manuscript: “The Effects of Gamification on

Engineering Lab Activities”, Eunsik Kim, Ling Rothrock, and Andris Freivalds, published in

Frontiers in Education Conference (FIE), 2016.

3.1 Introduction

The purpose of this study was to explore through empirical evidence the effects of

gamification on students’ performance of engineering lab activities.

The following hypotheses were developed:

H1: The gamification system will motivate students

H2: The gamification system will increase students’ engagement

H3: The gamification system will increase students’ performance

3.2 Methods

Two types of websites were created to collect data from students enrolled in an undergraduate introductory human factors course (IE327) taught at The Pennsylvania State

University in the fall semester of 2015. This course is a first-level junior course required for all

baccalaureate students in the Department of Industrial and Manufacturing Engineering and was

selected for this study because it includes a lab activity with more than 100 students. The course

has 6 lab sections in which the maximum number of students is 24. In the first week of lab

activity, we introduced the background and purpose of this study as well as the research question


and data-collection websites. Only students who wanted to participate in this study were then

asked to join the websites and practice the activities. They were also to take the general

knowledge test. This study received institutional review board (IRB) approval from

Pennsylvania State University.

3.2.1 Websites

For this study, we established websites of two types in which students could create their

own multiple-choice questions (MCQs) and answer the questions authored by classmates, as

based on a previous study (Denny, 2013). The two types of websites were Gamification (GM)

and Non-Gamification (NG). While the GM website included game elements including a Badge

System, Score, Avatar, Leaderboard, Level, and Feedback (Notification), the NG website was a

traditional website without game elements. Several previous studies showed that having students

create their own questions is an effective learning technique that also helps students to develop

self-regulating skills (Foos, 1989; Nicol, 2007). Furthermore, according to Bloom’s revised

taxonomy, to have students create their own questions requires them to employ the most

advanced step in the learning process, “Creating,” which involves designing, constructing,

planning, producing, inventing, devising, and making (Anderson et al., 2001). Examples of the

websites as seen by the students are shown in Figure 3-1. In these websites, when the students

created questions, they had to also provide an explanation for the correct answer in their own

words. These explanations appeared with the correct answer whenever other students submitted

their own answers. When students answered the questions authored by their classmates, they

also had to provide their opinion about whether they agreed with the correct answer or not,

evaluate the difficulty and quality of the question, and write comments about the question in

their own words. They also needed to decide whether to “follow” the author of the question or


not. The “follow” function enables students to view the questions created by specific authors in

the first row among the unanswered questions.

The game elements were only available in the GM website. Scores for students in this

website were calculated by an algorithm based on the number of questions authored as well as

the number of answers given and the feedback provided by other students. This score was then

used to determine level and ranking for competition between the students. The GM website

included two more pages, called “View my badges” and “Leaderboard.” Samples of these two

pages are shown in Figure 3-2. In the “View my badges” page, students could identify what

they earned from the activity. A total of 27 kinds of badges were available. Examples of the

badges with their descriptions are shown in Table 3-1. In the “Leaderboard” page, students could

see where they ranked on the website in relation to their classmates. They could also see

information about specific rankings such as Username, Level, Score, the number of earned

badges, questions, and answers.
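To illustrate the kind of scoring algorithm described above, the following minimal sketch combines the three activity counts into a score, maps it to a level, and sorts a leaderboard. The weights, level thresholds, and names are hypothetical; the dissertation does not report the exact formula used on the GM website.

    # A minimal sketch of a point-scoring scheme of the kind described above.
    # Weights and level thresholds are assumed, not the website's actual values.

    def compute_score(questions_authored: int, answers_given: int,
                      feedback_received: int) -> int:
        """Combine the three activity counts into one score (assumed weights)."""
        return 10 * questions_authored + 2 * answers_given + feedback_received

    def compute_level(score: int) -> int:
        """Map a score onto a level using assumed fixed thresholds."""
        thresholds = [0, 50, 150, 300, 500]          # hypothetical level boundaries
        return sum(score >= t for t in thresholds)   # yields level 1 through 5

    # Example: 3 questions, 18 answers, and 7 pieces of feedback from classmates.
    score = compute_score(3, 18, 7)                  # 10*3 + 2*18 + 7 = 73
    level = compute_level(score)                     # 73 falls in level 2

    # The leaderboard is then simply the students sorted by score, highest first.
    students = {"user_a": 73, "user_b": 120, "user_c": 45}
    leaderboard = sorted(students.items(), key=lambda kv: kv[1], reverse=True)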


Figure 3-1 The main menus of the two websites: (A) Gamification (B) Non-Gamification


Figure 3-2 Samples of the (A) “View my badges” and (B) “Leaderboard” pages


Table 3-1 Examples of the badges with descriptions

Basic
  1. Author: For contributing your first question on the website
  2. Answerer: For answering your first question on the website
  3. Comment: For the first time you write a comment about a question
  4. Evaluator: For the first time you "agree" or "disagree" with a previous comment written by classmates
  5. Follower: For the first time you "follow" a question author
  6. Inspector: For the first time you give feedback about a correct answer
  7. Commitment: For answering at least 10 questions on each of 3 different days

Standard
  1. Prolific Author: For contributing at least 20 questions on each of 5 different days
  2. Good Answerer: For answering at least 50 questions on each of 5 different days
  3. Reporter: For writing comments for at least 20 questions on each of 5 different days
  4. Voter: For submitting at least 50 "agree" or "disagree" votes on previous comments written by classmates, on each of 5 different days
  5. Supporter: For following at least 10 question authors on each of 3 different days
  6. Rater: For submitting at least 30 pieces of feedback about correct answers on each of 5 different days
  7. Diligence: For contributing questions on 3 consecutive days
  8. Earnest: For contributing questions on 10 consecutive days
  9. Scholar: For correctly answering at least 10 questions in a row on 3 different days
  10. Genius: For correctly answering at least 20 questions in a row on 5 different days
  11. Strivers: For answering questions on 10 consecutive days

Advanced
  1. Leader: For at least 20 followers
  2. Professor: For 5 questions you contribute that receive at least 5 "hard" ratings for the difficulty of the question
  3. Good Author: For 5 questions you contribute that receive at least 10 "very good" ratings for the quality of the question
  4. Congressman: For writing at least 5 comments that receive an "agreement"
  5. Night Owl Author: For contributing a question between 12 AM and 2 AM
  6. Night Owl Answer: For answering a question between 12 AM and 2 AM
  7. Early bird Author: For contributing a question between 7 AM and 9 AM
  8. Early bird Answer: For answering a question between 2 AM and 9 AM
  9. Popular Author: For a question you contribute that is answered at least 50 times
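As an illustration of how a rule-based badge such as "Commitment" (answering at least 10 questions on each of 3 different days) might be awarded, the sketch below checks that rule against a hypothetical answer log; the log format and function name are assumptions, not the website's actual implementation.

    # A minimal sketch of checking the "Commitment" badge rule described in Table 3-1.
    from collections import Counter
    from datetime import date

    def earns_commitment(answer_dates, per_day=10, days_required=3):
        """answer_dates holds one date entry per answer a student submitted."""
        answers_per_day = Counter(answer_dates)
        qualifying_days = sum(1 for n in answers_per_day.values() if n >= per_day)
        return qualifying_days >= days_required

    # Example: ten or more answers on each of three days earns the badge.
    log = ([date(2015, 10, 5)] * 10 + [date(2015, 10, 7)] * 12
           + [date(2015, 10, 9)] * 10)
    print(earns_commitment(log))  # True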


3.2.2 Design of experiment

A total of 140 students enrolled in the course and its 6 lab sections in the fall semester of

2015. For the purpose of this study, students who wanted to participate were randomly assigned

to the experimental or control groups based on their lab sections. Students in sections 1, 3, and 5

were assigned to the NG group, while students in sections 2, 4, and 6 were assigned to the GM

group. The first phase of the study for students who wanted to participate was conducted with

Biomechanical Analysis of Lifting and CTD and Screwdriver Design lab materials. All students

who participated in the first phase received extra credit amounting to 0.5% of the overall course

grade and could receive an additional 1% extra credit if they met the minimum requirement of

creating 3 questions and answering 18 questions. In the GM group, if students were ranked in

the top 5%, they received a further 1% extra credit. In the NG group, students received an additional 0.1% extra credit, up to a maximum of 1%, for every additional five questions created or 15 questions answered after meeting the minimum requirement. Since the ranking system was one of the game

elements, we gave additional extra credit to the NG group based on the members’ effort at

creating and answering questions. To balance the GM and NG groups for additional extra credit,

the number of questions and answers required for additional extra credit in the NG group was set

based on the previous study (Landers, Callan, Freitas, & Liarokapis, 2011). In the second phase

of this study, all participating students were assigned to the opposite group; students assigned to

the NG website in the first phase were assigned to the GM website and vice versa. The second

phase for students who wished to participate was conducted with Time Study lab materials. All

students participating in the second phase received extra credit via the same method. Figure 3-3

shows the timeline on which this study was conducted. To avoid the possibility of students

completing all the required contributions in one day, students could not create more than 5

questions per day or answer more than 15 questions per day.


Figure 3-3 Timeline of the experiment
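The NG-group extra-credit rules described above can be summarized in a short sketch. The function below is only an illustration of the stated rules (0.5% for participating, 1% for meeting the minimum of 3 questions and 18 answers, then 0.1% per additional 5 questions or 15 answers, capped at 1%); it is not the script that was used for grading.

    # A minimal sketch of the NG-group extra-credit rules as stated in the text.
    def ng_extra_credit(questions: int, answers: int) -> float:
        credit = 0.5                                  # participation credit (%)
        if questions >= 3 and answers >= 18:
            credit += 1.0                             # minimum-requirement credit
            extra_units = (questions - 3) // 5 + (answers - 18) // 15
            credit += min(0.1 * extra_units, 1.0)     # additional credit, capped at 1%
        return credit

    print(ng_extra_credit(2, 30))    # 0.5  (minimum requirement not met)
    print(ng_extra_credit(8, 33))    # 1.7  (one extra 5-question and one 15-answer unit)
    print(ng_extra_credit(60, 200))  # 2.5  (additional credit capped at 1%)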

3.3 Results

To answer the first research question, frequency analysis was conducted. The numbers

of students who joined the websites for each phase are shown in Table 3-2.

In the first phase, of the 67 students in the GM group and 73 in the NG group, 48 and 50

respectively joined and participated in each website. We observed a higher participation rate in

the second phase compared with the first, showing that 61 of 73 in the GM group and 51 of the

67 in the NG group joined the websites. The signup rate for the GM website was 3.1 percentage points higher than that for the NG website in the first phase (71.6% vs. 68.5%) and 7.5 percentage points higher in the second phase (83.6% vs. 76.1%).

Table 3-2 The number of students who joined each type of website for each phase

                    1st Phase                      2nd Phase
               GM website    NG website       GM website    NG website
Participants   48 (71.6%)    50 (68.5%)       61 (83.6%)    51 (76.1%)
All            67            73               73            67


The summary of website activities, including the number of questions authored, the

number of answers submitted, and the number of distinct days of activity between the two types

of websites for each phase, is shown in Table 3-3. The number of distinct days was defined as the number of

days on which a student was considered to be active on the assigned website, either authoring or

answering at least one question.

To answer the second research question, a two-sample t-test was conducted to determine

the significance of the differences in website activities between the two groups for each phase.

The number of questions authored by students was not significantly different between

the two groups for both phases (1st phase: t (94) = -0.27, p = 0.788; 2nd phase: t (80) = 0.41, p =

0.683). However, both the quality and difficulty of the questions showed significant differences

between the two groups for both phases (Quality: 1st phase: t (459) = 3.65, p = 0.000, 2nd phase:

t (468) = 12.29, p = 0.000; Difficulty: 1st phase: t (352) = 2.79, p = 0.006, 2nd phase: t (556) =

9.29, p = 0.000). The number of answers and comments and the percentage of correct answers

were significantly different between the two groups in the first phase (Answer: t (92) = 2.03, p =

0.045; Comments: t (468) = 3.50, p = 0.001; Correct Answers: t (3043) = 2.17, p = 0.030);

however, in the second phase those factors were not significantly different between the two

groups (Answer: t (109) = 0.15, p = 0.878; Comments: t (659) = -1.87, p = 0.063; Correct

Answers: t (5129) = 1.40, p = 0.162). Finally, the number of distinct days showed a significant

difference between the two groups for both phases (1st phase: t (79) = 1.99, p = 0.050, 2nd

phase: t (86) = 2.09, p = 0.040).


Table 3-3 The summary of website activities between the two groups for both phases

The number of Questions
  1st phase: GM N=48, M=5.06 (SD=6.4); NG N=50, M=5.4 (SD=5.94); p=0.788, d=0.06
  2nd phase: GM N=61, M=7.3 (SD=26); NG N=51, M=5.9 (SD=10); p=0.683, d=0.07

The Quality of Questions
  1st phase: GM N=244, M=2.29 (SD=0.91); NG N=270, M=1.91 (SD=1.44); p=0.000***, d=0.32
  2nd phase: GM N=448, M=2.68 (SD=0.56); NG N=299, M=1.99 (SD=0.86); p=0.000***, d=1.04

The Difficulty of Questions
  1st phase: GM N=244, M=1.01 (SD=0.29); NG N=270, M=0.87 (SD=0.75); p=0.006**, d=0.24
  2nd phase: GM N=448, M=1.3 (SD=0.59); NG N=299, M=1.02 (SD=0.17); p=0.000***, d=0.63

The number of Answers
  1st phase: GM N=48, M=42.3 (SD=30.2); NG N=50, M=28.3 (SD=37.6); p=0.045*, d=0.41
  2nd phase: GM N=61, M=49.6 (SD=77.1); NG N=51, M=47.5 (SD=65.7); p=0.878, d=0.02

The number of Comments
  1st phase: GM N=244, M=2.27 (SD=3.06); NG N=270, M=1.4 (SD=2.49); p=0.001***, d=0.31
  2nd phase: GM N=448, M=1.38 (SD=2.45); NG N=299, M=1.72 (SD=2.34); p=0.063†, d=0.15

The percentage of Correct Answers
  1st phase: GM N=2002, M=0.80 (SD=0.4); NG N=1459, M=0.77 (SD=0.42); p<0.03*, d=0.08
  2nd phase: GM N=3027, M=0.81 (SD=0.39); NG N=2426, M=0.79 (SD=0.41); p=0.162, d=0.04

The number of Distinct Days
  1st phase: GM N=41, M=10.54 (SD=6.54); NG N=47, M=7.97 (SD=5.69); p=0.050*, d=0.42
  2nd phase: GM N=51, M=14.1 (SD=5.57); NG N=47, M=11.36 (SD=7.2); p=0.040*, d=0.43

† p < .1, * p < .05, ** p < .01, *** p < .001.
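For readers who wish to reproduce this style of analysis, the sketch below runs a two-sample t-test and computes Cohen's d on made-up activity counts. It uses Welch's variant of the t-test and a pooled-standard-deviation effect size, which are reasonable choices but are assumptions rather than a record of the exact procedure used in this study.

    # A minimal sketch of the two-group comparison reported in Table 3-3,
    # using made-up per-student counts of answers submitted.
    import numpy as np
    from scipy import stats

    gm = np.array([42, 55, 31, 60, 48, 39, 70, 25, 52, 44])   # gamification group
    ng = np.array([28, 35, 22, 41, 30, 19, 33, 26, 38, 24])   # non-gamification group

    # Two-sample t-test (Welch's variant, which does not assume equal variances).
    t_stat, p_value = stats.ttest_ind(gm, ng, equal_var=False)

    # Cohen's d based on the pooled standard deviation.
    pooled_sd = np.sqrt(((len(gm) - 1) * gm.var(ddof=1) + (len(ng) - 1) * ng.var(ddof=1))
                        / (len(gm) + len(ng) - 2))
    cohens_d = (gm.mean() - ng.mean()) / pooled_sd

    print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")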

The students’ performance was compared between the GM group and the NG group for

each phase based on the students’ scores on the general knowledge test, midterm, and final

exam. The difference between the general knowledge score and midterm score was used as the

performance data for the first phase of the study, and the difference between the general

knowledge score and final exam score was used as the performance data for the second phase.

The exams were graded by the course instructor who was not directly involved in the

gamification components of the lab exercises. To answer the third research question, a two-

sample t-test was conducted to determine the significance of the differences in the exam scores

between the groups for each phase, and the results are shown in Figure 3-4. The students’

performance was significantly different between the two groups for both phases (1st phase: t (83)

= 2.12, p = 0.037; 2nd phase: t (83) = 2.20, p = 0.030).

To investigate whether the sequence of students’ assignment to the GM and NG groups

affected the level of student engagement, we compared the difference in students’ activities


between the two sequences only for students who participated in both phases using a two-sample

t-test. The control group consisted of the students who used the GM system in the first phase and

the NG system in the second phase (GM to NG). The treatment group consisted of the students

who used the NG system in the first phase and the GM system in the second phase (NG to GM).

Figures 3-5 to 3-7 show the results in terms of the number of questions authored, answers

submitted, and distinct days, respectively. For the number of questions authored, there was no

significant difference between the two groups. However, there were significant differences

between the two groups for the number of answers submitted and distinct days (Answer

submitted: t (78) = -1.98, p = 0.05; Distinct day: t (59) = 2.31, p = 0.024).

Figure 3-4 The difference in exam scores between the two groups for both phases


Figure 3-5 The difference in the number of questions between the two groups (GM to NG, NG to GM)

Figure 3-6 The difference in the number of answers between the two groups (GM to NG, NG to GM)


Figure 3-7 The difference in the number of distinct days between the two groups (GM to NG, NG to GM)

3.4 Discussion

Gamification is a fast-growing approach used to encourage user motivation and

engagement in non-gaming environments. In a general survey of the literature, Seaborn and her

colleagues (Seaborn & Fels, 2015) found that over 750 articles have been written on

gamification. With respect to engineering education, Bodnar & Clark (2014) surveyed 191

papers that used games in engineering classes. The findings of both Seaborn and Bodnar agree

that gamification resulted in an overall positive experience for users and students. However, both

researchers’ findings also concur that the majority of applied work is not grounded in theory and

does not use a standard gamification framework. Moreover, from the surveyed literature, only a

small subset demonstrated a systematic approach to analytically assessing the benefits of

implementing games (or game components) in the classroom (Bodnar & Clark, 2014). In this

context, a major gap exists in that gamification research needs to be improved based on


empirical validation of the effectiveness of various gamification methods, which we tried to

address by comparing GM and NG groups.

Two types of websites, GM and NG, were created to explore the effects of gamification

on engineering lab activities by providing empirical evidence of the effect of gamification on

students working in an engineering lab. The results suggest that gamification had a positive

effect in terms of motivation, engagement, and performance on engineering lab activities,

indicated by the higher number of students who joined the GM website, the higher number of

answers submitted to the GM website (in the first phase), and the higher number of distinct days

of participation for students in the GM group. Although gamification did not increase the number of questions authored by students in the GM group in either phase, the

difficulty and quality of questions authored by students in the GM group were higher than those

of the NG group in each phase. These results indicate that the GM group invested greater time

and effort in creating questions compared with the NG group.

Despite these differences in participation and engagement, in the second phase the

number of answers submitted did not differ significantly between the two groups. One potential

reason may be that most students in both groups selected answering questions as the best method

for preparing for the final exam, which had a great impact on their grades. The total number of answers

submitted by both groups combined increased by around 2000 in the second phase. The number

of questions created by students did not differ significantly between the two groups in either

phase. One possible explanation for this is that creating their own questions presented the

greatest challenge for students. Each student created only an average of 6 questions for each

phase, and this number is only half of the average number of distinct days for all students in both

groups. This phenomenon was also found in previous studies (Denny, 2013; Denny, Luxton-

Reilly, & Hamer, 2008) and may be due to the fact that creating their own questions requires

greater time and effort for students when compared to other activities. Another potential reason


is that, since there were only one or two lab materials on which students could base their questions, it was difficult to avoid repeated questions.

Additionally, we investigated whether the sequence in which the system was used

between GM and NG affected the level of student engagement by comparing two groups, one

for each sequence. Since the general trend of decreasing motivation over time is shown in the

literature (Eccles, Wigfield, & Schiefele, 1998; Pintrich & Schunk, 2002; Zusho, Pintrich, &

Coppola, 2003), we expected that the activity of students in both groups might decrease in the

second phase compared with the first phase. However, students’ activities did not decrease in

either group. Furthermore, the increase in the treatment group was significantly greater than that

in the control group. Based on the slope between the two phases for each group, we can explain

the effect of gamification on students’ motivation, which shows a significant difference in the

number of answers submitted and distinct days.

Based on our results, we can conclude that a positive effect of gamification on student

learning in engineering lab activities was ultimately found.

Since this was an initial pilot study, it has several limitations. Though our study focused

primarily on the effectiveness of gamification on engineering lab activities, future research

should consider which elements of gamification had the greatest impact on students’ motivation,

engagement, and performance. In addition, further research is required to consider real-time

evaluation of student behavior by using an eye-tracking system.


Chapter 4

An Empirical Study on the Impact of Lab Gamification on Engineering

Students’ Satisfaction and Learning

The following chapter is from the manuscript: “An Empirical Study on the Impact of Lab

Gamification on Engineering Students’ Satisfaction and Learning”, Eunsik Kim, Ling Rothrock,

and Andris Freivalds, published in International Journal of Engineering Education, 2018

4.1 Introduction

The present study determines whether gamification has positive effects on engineering

lab activities in terms of motivation, engagement, and learning outcome. This study is an

extension of the previous chapter, which only considered the data from gamification systems,

leaving several open questions about students’ perspective. Thus, we evaluated our gamification

systems based on student perceptions in terms of active learning, motivation, and game elements

by using responses to a questionnaire (see Appendix). In addition, we examined the relationship

between the level of gamification engagement and learning outcomes. Lastly, we investigated

which game elements best motivated students and facilitated their enjoyment. The following

hypotheses were developed:

H1: The students’ perception of how the gamification system is helpful to active

learning will affect their level of gamification engagement and learning outcomes.

H2: The students’ perception of how the gamification system motivates them will affect

their level of gamification engagement and learning outcomes.


H3: The students’ perception of how often they keep track of game elements in the

gamification system will affect their level of gamification engagement and learning

outcomes.

H4: There is a significantly positive relationship among students’ perceived effect of the

gamification system, the level of gamification engagement, and learning outcomes.

4.2 Method

The course chosen to include gamification systems and used for data collection was

Introduction to Work Design, a course in the Department of Industrial and Manufacturing

Engineering at The Pennsylvania State University. This course is a first-level junior course

required for all undergraduate students in the department. There are two reasons why this course

was selected: (1) it is one of the largest classes required for all undergraduate students in the

department and (2) it uses active-learning strategies and hands-on experiments within the

integrated laboratory class.

4.2.1 Website

For this study, we established websites of two types: Gamification (GM) and non-

gamification (NG) websites, as shown in Figure 4-1. While the GM website included several

game elements such as a badge system, score, avatar, leaderboard, level, and feedback

(notification), the NG website was a traditional website without game elements. In these

websites, students were requested to conduct two main activities based on a previous study (Denny, 2013): (1) create their own multiple-choice questions (MCQs) and (2) solve the questions generated by classmates. Question generation is one of the most effective learning methods because, as an active learning strategy, it requires students to attend to the content and main ideas and assesses whether the content was understood. Numerous studies over the past few decades have reported positive results in terms of enhanced comprehension of learned content (Hanus & Fox, 2015; Stavljanin, Milenkovic, & Sosevic, 2016) and the promotion of motivation (Chin & Brown, 2002), group communication (Fu‐Yun Yu et al., 2005), and higher-order cognitive skills (Brown & Walter, 2004; Drake & Barlow, 2008).


Figure 4-1 The main pages of the two websites: (A) Gamification, (B) Non-gamification

For creating questions, students were asked to write their own questions and to provide

one correct answer, four alternative answers, and an explanation for the correct answer for each

question. This explanation appeared with the correct answer. For answering questions, students

were asked to solve the question, check the correct answer, and evaluate the question. There was


an evaluation page as shown in Figure 4-2. When students answered the questions authored by

their classmates, this page appeared asking students to indicate whether they agreed with the

correct answer and how they would rate the quality and difficulty of the question. Students were also given the option to write their opinion about the question by providing feedback in the

comment area. In addition, there was a function labeled “follow,” which enabled students to

view the questions written by selected authors in the different unanswered questions table.

Figure 4-2 Example of the evaluation page on the website

In the GM website, scores for students were calculated using an algorithm based on the

number of questions authored, the number of correct answers given, and the feedback provided

by other students as follows:

• When students created a question, they received up to 1000 points.

o When students registered a question, they received 300 points (basic score).

o If students received more 'yes' than 'no' feedback for the correct answer, they received an additional 200 points.


o Additional points up to 500 were available depending on the mean value of the

quality rating.

• When students correctly answered a question authored by peers, they received 200 points.

• Students could receive up to 5000 points per day.

Students could not create more than 5 questions per day or answer more than 15

questions per day in order to prevent students from completing all the required contributions in

one day. This score was then used to determine level and ranking for competition between the

students. The detailed information for the websites is available in our previous study (Kim,

Rothrock, & Freivalds, 2016).
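To make the scoring rules above concrete, the following short Python sketch implements them as described in the bullet list; it is illustrative only (the production website was not written this way), and the exact scaling of the quality bonus is an assumption, since only the 500-point maximum is stated.

    DAILY_CAP = 5000  # maximum points a student could earn per day

    def question_points(yes_votes, no_votes, mean_quality):
        # 300 basic points, +200 if 'yes' feedback outnumbers 'no' feedback,
        # plus up to 500 scaled by the mean quality rating (1-5); the linear
        # scaling of the quality bonus is an assumption, not stated in the text.
        points = 300
        if yes_votes > no_votes:
            points += 200
        points += round(500 * mean_quality / 5.0)
        return points  # at most 1000 points per question

    def answer_points(is_correct):
        # 200 points for correctly answering a classmate's question, else 0
        return 200 if is_correct else 0

    def daily_score(question_scores, answer_scores):
        # sum of the day's points, truncated at the 5000-point daily cap
        return min(sum(question_scores) + sum(answer_scores), DAILY_CAP)

    # Example: two authored questions and five correct answers on one day
    print(daily_score([question_points(4, 1, 4.2), question_points(2, 3, 3.0)],
                      [answer_points(True)] * 5))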

4.2.2 Participants and procedure

A total of 140 students, enrolled across the course's 6 lab sections, completed a series of two activities throughout the fall semester of 2015. Students interested in participating in this activity were randomly assigned to the GM or NG group by their lab

sections. For example, students in sections 1, 3, and 5 were assigned to the NG group, while

students in sections 2, 4, and 6 were assigned to the GM group.

For the first phase of the study, Biomechanical Analysis of Lifting and CTD and

Screwdriver Design lab materials were used, and for the second phase of the study Time Study

lab materials were used. These materials involve many variables, limitations, and questions, which makes them difficult to memorize, and students can easily become confused. Furthermore,

Biomechanical Analysis of Lifting and CTD and Screwdriver Design constitute around 50% of

the midterm, and Time Study constitutes around 50% of the final exam. By participating in this

study, students could earn extra credit of up to 5% of their overall course grade, as summarized in

Table 4-1.


Table 4-1 Summary of extra credit for participating in this study

                                                    GM                               NG
  Sign up Extra Credit                              0.5%                             0.5%
  Minimum Requirement Extra Credit (3 Q + 15 A)     1.0%                             1.0%
  Additional Extra Credit                           1.0% if ranked in the top 5%     0.1% per additional 5 Q or 25 A (up to 1.0%)

For the second phase of this study, students who had been in the NG group during the first phase were assigned to the GM website, and vice versa, in order to give both groups an equal opportunity to

experience both educational environments.

In the second week (first lab) of the semester, we described this study in detail, including its background and purpose as well as the research questions and data-collection websites. Only students who were interested in participating were then asked to join the websites and practice the activities after the lab activity. They were also asked to take the general knowledge test. The first phase of this study was conducted from the 5th week until the midterm. After that, students were asked to complete a questionnaire developed to capture their perspectives and satisfaction.

The second phase of this study was conducted from the 13th week to final exam week.

After that, students were asked to complete the questionnaire again. The detailed timeline of this

study is shown in Figure 4-3.


Figure 4-3 Timeline of the experiment

4.2.3 Measurement and Data analysis

We developed two questionnaires (Tables A and B in the Appendix): one to measure each student's perceived effect of the gamification and active learning strategies on motivation and learning outcomes, and another to capture students' perspectives on specific game elements, including open-ended questions. The first questionnaire was composed of 16 items. The students

were asked to respond on a 5-point Likert scale, from 1 (never or strongly disagree) to 5 (always

or strongly agree). The second questionnaire consisted of 4 short answer questions regarding

game elements and one open-ended question.

All statistical analyses were performed using the Statistical Package for the Social Sciences

(SPSS). A factor analysis with the Varimax method was performed on the data to explore the

possible structure of the questionnaire. In extracting factors, only the factors having eigenvalues

greater than 1 were considered significant. We conducted a frequency analysis of the responses to the questionnaire in order to categorize them into the following 3 groups: high (responses above the 75th percentile), moderate (responses between the 25th and 75th percentiles), and low (responses below the 25th percentile). These groups were then used in ANOVA tests of hypotheses H1 to H3. The correlation analysis was also performed on questionnaire results,

learning outcomes, and gamification engagement for hypothesis H4. Learning outcomes refers to

a student’s exam score and gamification engagement refers to the ‘score’ in the gamification

system based on the algorithm described previously. We conducted frequency analysis for

students’ answers to short answer questions regarding game elements. To analyze open-ended

questions regarding students’ suggestions for the next gamification system, we extracted

keywords from students' responses, grouping similar words and phrases that expressed the same idea, and

conducted a frequency analysis using those keywords. The non-gamification group was asked to

complete only one questionnaire (Table C in the Appendix) regarding active learning strategy

questions because there were no game elements. We conducted a paired t-test to investigate

whether gamification can be used as a supporting tool for an active learning strategy in order to

sustain students’ motivation.

4.3 Results

4.3.1 Demographic statistics

Among a total of 140 students, only 86 results could be analyzed because 54 students

did not fully complete all measurements. Of those 86, 51 were male and 35 were female; 42 students regularly played games, while 44 did not; and 60 were already familiar with game elements.


4.3.2 Factor analysis

An exploratory factor analysis yielded a 13-item scale and three factors, which together

accounted for 69% of the variance and received an acceptable value on the Kaiser-Meyer-Olkin

measure of sampling adequacy (KMO 0.806). All the variables received acceptable values of

communality, ranging from 0.516 to 0.856. Also, Bartlett's Test of Sphericity was significant (p = .000), suggesting that the data were suitable for performing the exploratory factor analysis. The first factor accounted for 48.8% of the variance (eigenvalue = 6.348), the second factor for 13.2% of the variance (eigenvalue = 1.719), and the last factor for 9.7% of the variance (eigenvalue = 1.256). All items displayed loadings above .50 on their primary factors, as shown in Table 4-2. The first factor, composed of 5 items, was labeled Active Learning because it describes the main activities in the gamification system, such as the effects of generating and answering questions. The second factor, composed of five items, was named Game Elements because all of its questions relate to game elements such as badges, ranking, and score. The last factor included 3 items regarding motivation; thus, it was labeled Motivation.


Table 4-2 Results of factor analysis from questionnaire

Item                                                            AL      GE      M       Cronbach   Mean corr. with
                                                                                        alpha      other scales
Q1. Creating my own questions was an effective way of
    learning in this class                                      .873                    .903       .338
Q2. Answering questions that were created by classmates
    was an effective way of learning in this class              .843
Q3. These activities (creating and answering questions)
    were helpful in preparing for exams.                        .797
Q4. This website was helpful for preparing exams.               .786                               .308
Q5. There was a sufficient number of questions to learn
    the lab material.                                           .719
Q6. Did you actively try to score (earn points)?                        .839            .844       .275
Q7. Did you actively try to earn badges?                                .796                       .304
Q8. Did you keep track of your level?                           .336    .657                       .345
Q9. Did you keep track of your ranking?                         .504    .642
Q10. Did you keep track of the number of followers?             .415    .503                       .301
Q11. These game elements increased my enjoyment of
     doing activities.                                                          .894    .800       .363
Q12. These game elements motivated me to participate
     more than I would have otherwise.                                          .816
Q13. I think that my level of involvement was high.                     .489    .604
Eigenvalues                                                     6.348   1.719   1.256
KMO (Kaiser-Meyer-Olkin)                                        0.806
Bartlett's Test of Sphericity: Approx. Chi-Square 815.494, df 78, Sig. .000
Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
AL: Active Learning; GE: Game Elements; M: Motivation
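An analysis of the kind reported in Table 4-2 can be sketched in Python with the factor_analyzer package as a stand-in for SPSS; the response matrix below is randomly generated for illustration, and the extraction method may differ slightly from the principal component analysis used here.

    import numpy as np
    import pandas as pd
    from factor_analyzer import FactorAnalyzer
    from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                                 calculate_kmo)

    # Illustrative response matrix: 86 students x 13 Likert items (random here)
    rng = np.random.default_rng(0)
    responses = pd.DataFrame(rng.integers(1, 6, size=(86, 13)),
                             columns=[f"Q{i}" for i in range(1, 14)])

    # Sampling adequacy and sphericity checks
    chi_square, bartlett_p = calculate_bartlett_sphericity(responses)
    _, kmo_model = calculate_kmo(responses)

    # Three-factor extraction with varimax rotation
    fa = FactorAnalyzer(n_factors=3, rotation="varimax")
    fa.fit(responses)
    loadings = pd.DataFrame(fa.loadings_, index=responses.columns,
                            columns=["AL", "GE", "M"])
    print(f"KMO = {kmo_model:.3f}, Bartlett p = {bartlett_p:.3f}")
    print(loadings.round(2))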

4.3.3 Descriptive Analysis

Mean and standard deviation were used as descriptive statistics for questionnaire

response, gamification engagement, and learning outcomes as shown in Table 4-3 to Table 4-5,


respectively. The highest mean value was 3.6163 in Active Learning and the lowest mean value

was 2.8907 in Motivation. Table 4-4 contains detailed information about gamification

engagement including the number of Questions, Answers, and Badges; the quality of Question;

and the Score. Score was calculated based on the number of Questions, number of Answers, and

the quality of Question as described previously. The mean values of the number of Questions,

Answers, Badges, and Distinct Days are 7.33, 56.35, 7.79, and 5.84, respectively. In Table 4-5, the pre-test refers to the general knowledge test and the post-test refers to the midterm and final

exam score. The mean values of the pre-test and the exam scores were 29.53 and 73.22,

respectively.

Table 4-3 Descriptive statistics for questionnaire response

Items N Mean SD

Active Learning 86 3.6163 0.88342

Motivation 86 2.8907 0.93992

Game Elements 86 3.3451 0.95044

Table 4-4 Descriptive statistics for gamification engagement

Items N Mean SD

The number of Questions 86 7.33 22.15

The quality of Question 86 2.02 1.03

The number of Answers 86 56.35 72.58

The number of Badges 86 7.79 4.51

The number of Distinct Days 86 5.84 6.54

Score 86 15884.84 17470.62

Table 4-5 Descriptive statistics for learning outcomes

Items N Mean SD

Pre-Test 86 29.53 13.97

Exam Score 86 73.22 20.67


4.3.4 Hypothesis testing

H1: The students’ perception of how the gamification system is helpful to active learning

will affect their level of gamification engagement and learning outcomes.

The results of one-way ANOVA showed a significant main effect for gamification

engagement and learning outcomes (F(2,83) = 70.11, p<0.01 for gamification engagement;

F(2,83) = 70.36, p < 0.01 for learning outcomes) (Table 4-6). A post-hoc analysis of the main

effect indicated that the mean values of both gamification engagement and learning outcomes

were significantly higher in the ‘High’ group as compared to the ‘Moderate’ and ‘Low’ groups,

respectively. The mean values of both gamification engagement and learning outcomes for the

‘Moderate” group were also higher than that of the ‘Low” group. This data provides convincing

evidence that the more that students have a positive perception of active learning in the

gamification system, the higher the level of gamification engagement and the higher the learning

outcomes will be, resulting in acceptance of Hypothesis 1.

Table 4-6 ANOVA analysis of active learning for gamification engagement and learning outcomes

                           Active Learning
                           Low (n=22)           Moderate (n=42)       High (n=22)            F       Sig.       Post Hoc
                           Mean (SD)            Mean (SD)             Mean (SD)
Gamification Engagement    2931.82 (3500.26)    10704.76 (3926.70)    38727.09 (20421.00)    70.11   0.000***   H>M>L
Learning Outcomes          69.73 (8.05)         80.70 (5.08)          91.09 (5.08)           70.36   0.000***   H>M>L
† p < .1  * p < .05  ** p < .01  *** p < .001

H2: The students’ perception of how the gamification system motivates them will affect

their level of gamification engagement and learning outcomes.

The ANOVA results for H2 are shown in Table 4-7. There were significant differences in gamification engagement and learning outcomes by the level of students' perceived motivation (F(2,83) = 3.50, p < 0.05 for gamification engagement; F(2,83) = 11.11,


p<0.01 for learning outcomes). A post-hoc analysis of the main effect also showed significantly

higher mean values of both gamification engagement and learning outcomes in the ‘High’ group

as compared to ‘Moderate’ and ‘Low’ groups, respectively. Thus, findings support Hypothesis 2.

Table 4-7 ANOVA analysis of Motivation for gamification engagement and learning outcomes

                           Motivation
                           Low (n=22)           Moderate (n=42)       High (n=22)            F       Sig.       Post Hoc
                           Mean (SD)            Mean (SD)             Mean (SD)
Gamification Engagement    9766.4 (13340.5)     15258.1 (15574.11)    23200.14 (22086.89)    3.50    0.035*     H>M=L
Learning Outcomes          73.4 (11.93)         81.8 (7.08)           85.25 (7.73)           11.11   0.000***   H>M=L
† p < .1  * p < .05  ** p < .01  *** p < .001

H3: The students’ perception of how often they keep track of game elements in the

gamification system will affect their level of gamification engagement and learning outcomes.

The differences in gamification engagement and learning outcomes by the level of

students’ interest in game elements were clear. The one-way ANOVA test at a significance level

of 0.05 revealed statistically significant differences in gamification engagement and learning

outcomes, respectively, by the level of students’ interest in game elements (F(2,83) = 11.617,

p < 0.01 for gamification engagement; F(2,83) = 9.433, p < 0.01 for learning outcomes) (Table 4-8). A post-hoc analysis comparing the means of the three groups revealed that the scores of the 'High' and 'Moderate' groups were significantly higher than those of the 'Low' group, which supports Hypothesis 3.


Table 4-8 ANOVA analysis of Game Element for gamification engagement and learning outcomes

                           Game Elements
                           Low (n=21)           Moderate (n=44)       High (n=21)            F       Sig.       Post Hoc
                           Mean (SD)            Mean (SD)             Mean (SD)
Gamification Engagement    8471.43 (7404.43)    12923.3 (14159.83)    29502.1 (23229.28)     10.99   0.000***   H=M>L
Learning Outcomes          76.29 (9.16)         79.3 (9.27)           87.5 (7.52)            9.43    0.000***   H=M>L
† p < .1  * p < .05  ** p < .01  *** p < .001

H4: There is a significantly positive relationship students’ perceived effect of the

gamification system, the level of gamification engagement, and learning outcomes

Table 4-9 presents the correlation results among students’ perceived effect of the

gamification system, the level of gamification engagement, and learning outcomes. All factors

had significant correlations with gamification performance (p < 0.01), which supports

Hypothesis 4.

Table 4-9 Correlation results among gamification performance, the questionnaire results, and learning outcomes (n=86)

Pearson correlations, with two-tailed significance in parentheses:

                            Learning          Gamification      Active            Motivation        Game
                            Outcomes          Engagement        Learning                            Elements
Learning Outcomes           1
Gamification Engagement     .782 (.000***)    1
Active Learning             .896 (.000***)    .781 (.000***)    1
Motivation                  .540 (.000***)    .439 (.000***)    .471 (.000***)    1
Game Elements               .645 (.000***)    .679 (.000***)    .600 (.000***)    .543 (.000***)    1
† p < .1  * p < .05  ** p < .01  *** p < .001
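A correlation matrix like the one in Table 4-9 can be produced with a few lines of pandas/SciPy code; the sketch below uses illustrative data and column names rather than the study data.

    import pandas as pd
    from scipy import stats

    # Illustrative per-student data; column names mirror Table 4-9
    data = pd.DataFrame({
        "learning_outcomes": [78, 85, 69, 91, 74, 88],
        "engagement": [12000, 23000, 4000, 39000, 9000, 30000],
        "active_learning": [3.4, 4.2, 2.8, 4.8, 3.1, 4.5],
    })

    # Pairwise Pearson correlation matrix
    print(data.corr(method="pearson").round(3))

    # Two-tailed significance test for one pair of variables
    r, p = stats.pearsonr(data["learning_outcomes"], data["engagement"])
    print(f"r = {r:.3f}, p = {p:.3f}")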


4.3.5 Additional findings

In addition to the analyses related to the main research hypotheses, further analysis was

conducted. Table 4-10 presents the results of two-tailed paired t-tests between the gamification

group and the non-gamification group for active learning strategy perception. For most

questions, there were significant differences between the two groups. Except for Question 3, the

gamification group showed higher mean values.

Table 4-10 Results of two-tailed paired t-test

       Website   df   Mean    Std. Deviation   t       Sig. (2-tailed)   d
Q1     G         86   3.790   0.975            2.96    0.004***          0.40
       N         86   3.430   0.790
Q2     G         86   3.512   1.186            1.69    0.095†            1.23
       N         86   2.244   0.853
Q3     G         86   3.465   1.145           -0.90    0.369             0.14
       N         86   3.605   0.786
Q4     G         86   3.581   0.964            1.81    0.073†            0.29
       N         86   3.302   0.959
Q5     G         86   3.826   0.910            2.09    0.039*            0.35
       N         86   3.512   0.904
† p < .1  * p < .05  ** p < .01  *** p < .001
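The paired t-tests and effect sizes (d) in Table 4-10 can be reproduced with SciPy along the lines of the following sketch; the response arrays are illustrative, and Cohen's d is computed with one common convention for paired data (mean difference divided by the standard deviation of the differences), which may differ from the exact formula used in the original analysis.

    import numpy as np
    from scipy import stats

    # Illustrative item responses from the same students on both websites
    gm = np.array([4, 5, 3, 4, 4, 5, 3, 4], dtype=float)  # gamification website
    ng = np.array([3, 4, 3, 3, 4, 4, 2, 3], dtype=float)  # non-gamification website

    t_stat, p_two_tailed = stats.ttest_rel(gm, ng)

    # Cohen's d for paired data: mean difference / SD of the differences
    # (one common convention; the dissertation's exact formula is not stated)
    diff = gm - ng
    d = diff.mean() / diff.std(ddof=1)
    print(f"t = {t_stat:.2f}, p = {p_two_tailed:.3f}, d = {d:.2f}")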

4.3.6 Frequency analysis

The results of the frequency analysis indicated that 80% of students were motivated by 'Ranking' and 'Score' and that 50% of students found 'Badges,' 'Feedback,' and 'Avatar' enjoyable. Students chose 'Ranking' and 'Score' as the game elements to be retained in the new gamification system. 23% of students answered that all game elements need to be included in a new gamification system, while 22% and 21% of students thought 'Avatar' and 'Leaderboard,' respectively, were unnecessary.


Figure 4-4 Results of frequency analysis for the short-answer questions

Table 4-11 shows the results of frequency analysis for keywords. Suggestions regarding

‘Question’ were most frequent, followed by ‘Design’ and ‘Function.’ There were only 4 students

who suggested changes to ‘Game Elements.’


Table 4-11 The results of frequency analysis for open-ended question

Keyword         Frequency   Example responses
Question        24          "the questions reviewed by TA to check they are relevant to course material and helpful," "Have TA/professor also submit questions," "should not have same questions asked," "Filter out questions," "question filtering," "It seemed that it was tough to avoid students posting some repeated questions, so maybe if the questions were sorted by topic it would be easier to see what has already been posted, and avoid asking the same question," "I would say this kind of activities should include more chapters"
Website Error   8           "Many questions were cut off," "score error"
Design          19          "the font should be made larger," "have a better dashboard for the user," "Improve layout," "improving the user interface would be a great way to retain everything in a more engaging way," "More interactive," "It's annoying to open a question one at a time and then having to leave feedback too," "Make the UI actually make sense (click buttons rather than text, etc.) and don't shorten questions in the list view, show the whole thing," "I believe the activity was already good maybe make interface more modern"
Game Element    4           "remove the ranking thing," "top scorers discourage others," "allow to upload image to be used avatar"
Function        16          "Provide detailed explanation of how to earn points," "Make the instructions," "provide help function," "And adding a feature which allows you to contact to TA/professor," "make an interactive tutorial"
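Once keywords have been assigned to the open-ended responses, the frequency count behind Table 4-11 reduces to a simple tally, sketched below in Python with illustrative keyword labels.

    from collections import Counter

    # Illustrative keyword labels assigned by hand to open-ended responses
    labels = ["Question", "Design", "Question", "Function", "Website Error",
              "Design", "Question", "Game Element", "Function", "Question"]

    for keyword, count in Counter(labels).most_common():
        print(f"{keyword}: {count}")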

4.4 Discussion

The purpose of this study was to investigate the effects of lab activity gamification on

students’ motivation, engagement, and learning outcome based on students’ performance and

students’ perspective. The research results indicated support for all hypotheses. In other words,

gamification had positive effects not only on students’ learning outcomes, but also on students’

learning satisfaction.

From hypotheses H1 and H4, we have concluded that the more positive students' perception of active learning in the gamification system, the higher the level of

gamification engagement and the higher the learning outcomes will be. Previous studies of

active learning that seek to increase student motivation suggest developing supporting

mechanisms to motivate students and thereby to increase student engagement (Chang et al.,


2012; Umetsu et al., 2002; Fu‐Yun Yu et al., 2005). In this study, we have shown that

gamification can be used as a supporting tool for an active learning strategy to sustain students’

motivation. Furthermore, students’ perception of active learning had high and significant

correlation with gamification engagement (r = 0.781) and with learning outcomes (r = 0.896).

The two-tailed t-test results revealed that the gamification group showed higher mean values for

most questions regarding the active learning method as compared with the non-gamification

group. One question did not show a significant difference between the two groups, since the

main activities of question generation and question solving were exactly the same between the

two websites.

From hypotheses H2 and H4, we have concluded that students’ motivation was

positively associated with gamification engagement and learning outcomes. The result was

consistent with prior research, all of which indicated that gamification has a positive effect on

students’ motivation, engagement, and learning performance (de Freitas & de Freitas, 2013;

Iosup & Epema, 2014; O’Donovan, Gain, Marais, Donovan, & Marais, 2013).

From hypotheses H3 and H4, we showed the positive effects of game elements in a

gamification system on gamification engagement and learning outcome. Then we also explored

which game elements best motivated and facilitated the enjoyment of students. For motivation,

most students were extrinsically motivated. Among 78 students, 61 students selected game

elements such as ‘Ranking,’ ‘Score,’ or ‘Level’ which were directly related to their extra credit

in the course, while 17 students selected game elements such as ‘Badge,’ ‘Feedback,’ and

‘Avatar.’ These research results may be expected because the previous studies point out that one

of the shortcomings of gamification focuses on extrinsic motivation (Chin & Brown, 2002; FY

Yu & Liu, 2008). However, for the question regarding enjoyment, around 50% of students (22

for badges; 8 for feedback; 6 for avatar) responded that game elements that were not related to

reward facilitated their enjoyment. These results suggest that students’ extrinsic motivation can


be converted into intrinsic motivation by using game elements such as badges and feedback to

increase their enjoyment as mentioned by previous studies (Bogost, 2011; Glover, 2013). For the

questions regarding which game element should be retained in the next gamification system,

'Ranking' was selected by the most students, followed by 'Score' and 'All of Them,' showing a

trend similar to that of the first question. In response to the question regarding which game elements should be removed, most students answered that no game element should be removed from the next gamification system, with 'Avatar' and 'Leaderboard' as the next most frequent responses. One possible explanation for the latter is the large gap between students on the leaderboard; students may have recognized that it was impossible to advance from their current position.

In the open-ended question regarding their suggestion for the next gamification system,

24 students suggested changes related to questions. For example, students suggested covering

more topics to avoid repeated questions, sorting questions by topic, submitting questions written

by the TA/professor, and so on. 19 students and 16 students made suggestions regarding

‘Design’ and ‘Function,’ respectively. They requested improving the website’s overall usability

and adding ‘Help’ and ‘Tutorial’ functions.

In summary, the results of this study suggest that the application of gamification in

engineering lab activities as a supporting tool has a positive effect on students’ motivation,

engagement, and learning outcome based on the relationship between students’ performance and

students’ perspective. Although previous studies have shown the effects of gamification in an

educational setting, these prior studies only focused on either students’ performance or students’

satisfaction (Deci & Ryan, 1990; Denny, 2013; Domínguez et al., 2013; Dong et al., 2012; Ryan & Deci,

2000b). In our research, we emphasize the consistency between students’ performance in and

subjective satisfaction with the gamification system by analyzing the level of gamification engagement,


learning outcomes, and questionnaire results. In addition, game elements such as ranking, score,

and badges can motivate students and make them feel that an educational activity is fun.


Chapter 5

Investigating the Impact of Personality Traits and a User-Centered Design

Process in a Gamified Laboratory Context

5.1 Introduction

In this study we explored the role of students’ personality traits in the effects of

gamification in terms of motivation, engagement, and learning outcomes. We also explored how

to build an effective gamification system by applying the UCD process. We expect that this

study will be a crucial first step toward documenting how to develop an effective gamification system. Thus, the following hypotheses were developed:

H1: The ease-of-use of the gamification website will increase students’ engagement in

gamification website activity.

H2: Different personality traits among students will lead to different levels of

gamification engagement and different learning outcomes.

H3: Different personality traits among students will yield different student self-

perceptions of (1) how the gamification system helps active learning, (2) how the

gamification system motivates them, and (3) how often they keep track of game

elements in the gamification system.

5.2 Pilot Experiment

5.2.1 Method

The pilot experiment was a usability test (UT) conducted to improve the gamification system. The purpose of the UT was to identify the needs of, and the interaction problems encountered by, the users/students when they use the gamification system as a supporting tool for their learning. We used the

concurrent think-aloud method in which students were encouraged to “think out loud” while

using the gamification system. We then developed a new gamification system for this study

based on the results of the UT. Finally, these two gamification systems were used to compare

the effects of gamification in terms of motivation, engagement, and learning outcomes.

5.2.1.1 Participants and procedure

The UT was conducted in an on-campus lab with 5 participants (3 females, 2 males)

with an average age of 21 (SD: .71). The decision to conduct the UT with 5 participants was based on previous research reporting that testing 5 users lets experimenters find almost as many usability problems as testing many more users (Nielsen & Landauer, 1993). We

recruited participants exclusively from among those students who had participated in

gamification activity for a previous study conducted in the fall semester of 2015, since the same

course materials were included in this gamification system (Kim, Rothrock, & Freivalds, 2018).

All participants were native speakers of English and received an incentive of $10 per hour for

their time. Each student volunteer was scheduled for a private, ninety-minute user session for the

study.

Participants were presented with an introduction to the study and were given the

opportunity to ask questions regarding the experiment. Consent was obtained prior to the start of

the experiment. There were two sections in the UT: (1) a predefined task and (2) a free task. In

the predefined task, there were 10 kinds of tasks regarding game elements and main activities

such as creating questions, searching the specific course material for answers to questions,

changing avatars, checking their/others’ ranking, and so on. In the free task, participants were

asked to engage in gamification as they normally do for 10 minutes. Audio recording was also


used to capture participants’ voices during the experiment. After completing the UT,

participants’ verbal protocols were transcribed into verbatim text. We performed verbal protocol

analysis on the transcripts and decomposed each participant’s transcript into individual segments

to capture and represent single units of thought. Inter-rater reliability of the selected sentences,

as determined by Cohen’s kappa, ranged from .68 to .78. Since the purpose of this experiment is

to identify usability problems in the gamification system, the portions of each transcript that

were not related to usability problems were ignored. We identified usability problems from user

comments regarding confusion, misunderstandings, or difficulties the user experienced.
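Inter-rater agreement of this kind can be computed with scikit-learn as in the sketch below; the segment labels are illustrative, and the library call is a stand-in for however kappa was actually computed in the study.

    from sklearn.metrics import cohen_kappa_score

    # Illustrative segment labels from two independent raters
    rater_1 = ["usability", "other", "usability", "usability", "other", "usability"]
    rater_2 = ["usability", "other", "usability", "other", "other", "usability"]

    kappa = cohen_kappa_score(rater_1, rater_2)
    print(f"Cohen's kappa = {kappa:.2f}")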

5.2.2 Results

Through detailed analysis of the transcripts, we identified a total of 25 unique usability

problems in 5 categories: (1) Design, (2) Navigation, (3) Game Element, (4) Main Activity,

and (5) Feedback. Table 5-1 shows an example and the number of detected problems in each

category. “Design problem” refers to a problem with the general design and layout including

content, font, color, placement, and images. A “Navigation problem” is a problem a user

encounters while navigating the website to complete the activities. “Game element problem”

refers to a specific problem regarding game elements such as avatar, badges, score, or ranking. A

“Main activity problem” is a problem a user encounters when creating and answering a question.

From these results, recommendations for modification of the website were developed. The

largest number of usability problems was identified in the category of Main Activity. For

example, 4 out of 5 participants failed to find specific types of questions that they wanted to

solve. 4 out of 5 participants commented that they wanted to re-attempt a question they had

answered incorrectly, but that there was no option to retry the question. In addition, 2 out of 5

participants failed to find the method to show the whole question when a lengthy question was


truncated. 3 out of 5 participants complained of too many redundant questions. To solve these

problems, question categories, a question-sorting function, a retry function, a redundant-question removal function, and a video tutorial were added. The second largest number of usability problems concerned design. Since each participant had different preferences for the design of the website, including font size, menu layout, icon positions, and color, we made the design of the gamification website customizable, so that users can change these settings as they wish. Other comments were also applied to the

modification of the website. Figure 5-1 and Figure 5-2 provide examples of the website pre- and

post-modifications. In Figure 5-2, added functions are identified with a red rectangle.

Table 5-1 Summary of usability problems

Category        Number of usability problems   Example
Design          6    "Want to change the avatar position," "Larger font," "Place shortcut on the right side"
Navigation      5    "Nothing happens when I click it. I guess there is a view my badges tab. I can click it. I'm not sure anything is happening," "No way to contact to TA in here"
Game Element    3    "No function on avatar," "Gonna click the view my badges tab, and I can see all the badges I have, but I am not sure what they mean"
Main Activity   8    "I am gonna hover over the view button. I guess I can't really search for that question. They aren't sorted by product," "You can't really tell what the question is about, what topic it is," "No information about correct answer if I submit wrong answer"
Feedback        3    "It doesn't tell me what the correct answer was so I'm not really sure," "Not really sure how many points each question is worth"


Figure 5-1 Example pages from the pre-modification gamification system: (a) main page, (b)

question list page, and (c) a question response page


Figure 5-2 Example of pages from the post-modification gamification system: (a) main page 1,

(b) main page 2, (c) question list page, (d) check answers page, (e) a question response page, and

(f) a question evaluation page.

5.3 Experiment

5.3.1 Method

One of the courses provided by the Department of Industrial and Manufacturing

Engineering at The Pennsylvania State University was chosen to collect the data for this study.

This course, Introduction to Work Design, is one of the largest classes in the department with

more than one hundred students and requires hands-on experiments within the integrated


laboratory class. For the experiment, we used the gamification system developed in a previous

study (Kim et al., 2016). The main concept of the gamification system is "learning by teaching," in

which students generate their own multiple-choice questions (MCQs) regarding course materials

and solve the questions authored by classmates. Several game elements such as a badge system,

score, avatar, leaderboard, level, and feedback (notification) were included to increase students’

motivation and engagement in this system.

5.3.1.1 Participants and procedure

The experiment was conducted in the fall semester of 2016 with 62 students (25

females, 37 males) with an average age of 20.51 (SD: .70). Participating students self-selected

from among 105 students enrolled in the course. We used two types of gamification systems: an

Initial Version (IV) and a User Centered Designed Version (UCD). 50 of the 62 students were

already familiar with game elements. 32 students regularly played games, while 30 students did not play games at all. There were 5 lab sections in this course with up to 24 students in

each section. This course was selected for this study because it is a first-level junior course

required for all undergraduate students in the department. Students who volunteered to

participate in this activity were assigned to the IV or UCD group based on lab section. Students

in sections 1, 3, and 5 were assigned to the IV group at the first phase and UCD group at the

second phase. The students from sections 2 and 4 were placed in groups opposite that of the

other students during each respective phase. Among the eleven lab materials, we selected three

for the first phase: Biomechanical Analysis of Lifting, Cumulative Trauma Disorders (CTD),

and Screwdriver Design. For the second phase, we selected one lab material: Time Study.

Students who participated in this study could earn extra credit of up to 2.5% of their final grade for each phase in which they participated, based on their performance in the gamification activity, as


summarized in Table 5-2. For students who did not participate in this study, we provided another

extra credit option in order to avoid equity issues.

Table 5-2 Summary of extra credit for participating in this study

Extra Credit

Sign up Extra Credit .5%

Minimum Requirement Extra Credit (3Q + 15 A) 1.0%

Additional Extra Credit only if students were ranked in the top 5% 1.0%

We introduced the gamification website and detailed information about this study

including its background and purpose to students in the second week (first lab) of the semester.

Students who were willing to participate voluntarily in this study were asked to sign a written

consent form and were given the opportunity to ask questions regarding the experiment. They

were also asked to complete the International Personality Item Pool (IPIP) questionnaire and the

general knowledge test. In addition, they had practice time for gamification activities such as

authoring questions and answering questions created by their classmates. The first phase took

place from the 5th week of the semester to midterm (through week 8). During the first phase,

students could conduct the gamification activities as frequently as they wanted, but were limited

to creating no more than 5 questions per day and answering no more than 15 questions per day.

The week after midterm, students were asked to complete a questionnaire on their perspectives

on and satisfaction with gamification activities. From the 13th week through final exams (week

16), the second phase of this study was conducted using the same procedure as the first phase

with students moved into the group opposite that to which they belonged in the first phase.


5.3.1.2 Measurement

Gamification activities

Students’ gamification activities were measured using two type of gamification systems:

IV and UCD. In these gamification systems, students were asked to create their own questions

regarding lab materials and answer the questions created by other students during the 8 weeks of

the combined first and second phases (4 weeks per phase). Students received points whenever

they created their own question and when they answered a question created by another student

correctly. Students received 300 points whenever they created a question. When students

answered a question, they were also asked the evaluation question "Do you agree with the correct answer?" If this question received more "Yes" than "No" responses from other students, the

author of the question received an additional 200 points. Students received up to 500 additional

points based on the average quality of their question as evaluated by their classmates on a scale

from 1 to 5. Therefore, students could achieve a maximum score of 1000 points per question.

Students also received 200 points for answering a question correctly and zero points for

answering incorrectly. The detailed algorithm is shown in Table 5-3. Students’ scores were then

used to determine level and ranking for competition between the students. The gamification

systems also included a limitation function that prevented students from completing all of the

required contributions in one day: Students could not create more than 5 questions per day or

answer more than 15 questions per day. More detailed information on the gamification websites

is available in a previous study (Kim et al., 2018).


Table 5-3 Score algorithm in gamification systems

Creating a question Points Answering a question Points

Basic score 300 Correct answer 200

Feedback score 0 or 200 Wrong answer 0

Quality score Up to 500

Questionnaire for students’ perception of the effects of gamification, active learning, and

motivation

A questionnaire developed in a previous study conducted by Kim et al. (2018) was used

to measure each student’s perceptions of (1) how the gamification system helps active learning,

(2) how the gamification system motivates them, and (3) how often they keep track of game

elements in the gamification system. This questionnaire consisted of 13 items rated on a 5-

point Likert scale, from 1 (never or strongly disagree) to 5 (always or strongly agree). Students

were asked to complete this questionnaire after each phase. The previous study showed the

following Cronbach alpha coefficients: Active learning .90, Game elements .84, and Motivation

.80. Cronbach alphas in the current study were: Active learning .901, Game elements .813, and

Motivation .831, indicating very good internal consistency.

Questionnaire for personality traits

Students’ personality traits were estimated by the 50-question version of the Big Five

factor lexical structure, which is part of the International Personality Item Pool (IPIP) (Goldberg,

1992). This questionnaire was selected because of its convenient availability and ease of

administration, and because it is a widely accepted and utilized source of various personality

scales. The five factor components are extraversion, agreeableness, conscientiousness,

neuroticism, and openness to experience. Each factor was measured by ten items with a 5-point

Likert scale. The original alpha coefficients were: extraversion .87, agreeableness .82,

conscientiousness .79, neuroticism .86, and openness to experience .84, suggesting good to very


good internal consistency (Goldberg, 1992). In this study, Cronbach alphas were: extraversion

.75, agreeableness .71, conscientiousness .68, neuroticism .72, and openness to experience .78,

suggesting acceptable internal consistency in all factors except for conscientiousness.
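Cronbach's alpha for a ten-item subscale can be computed directly from the item responses, as in the Python sketch below; the data are randomly generated for illustration, so the printed alpha is meaningless, but the formula is the standard one.

    import numpy as np

    def cronbach_alpha(items):
        # items: 2-D array, rows = respondents, columns = scale items
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances / total_variance)

    # Illustrative (random) responses: 62 students x 10 items of one subscale;
    # real item-level data would go here.
    rng = np.random.default_rng(1)
    items = rng.integers(1, 6, size=(62, 10)).astype(float)
    print(f"alpha = {cronbach_alpha(items):.2f}")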

5.3.1.3 Data analysis

The Statistical Package for the Social Sciences (SPSS) was used to perform all the statistical

analyses including a paired t-test and correlation analysis. The paired t-test was used to evaluate

hypothesis H1 and the correlation analysis was performed for hypotheses H2 and H3. Learning

outcomes can be defined as the difference between the pre-test score and exam score.

Gamification engagement refers to the score in the gamification system based on the algorithm

described previously.

5.3.2 Results

H1: The ease-of-use of the gamification website will increase students’ engagement in

gamification website activity.

The results of a two-sample t-test for website activities, including the number of

questions authored, the number of answers submitted, and the number of distinct days of activity

between the two websites for each phase, are shown in Table 5-4. Distinct Days refers to the number

of days on which a student was active on the assigned website, either creating or answering at

least one question. The number of questions authored by students was not significantly different

between the two groups in either phase. However, students in the UCD group answered

significantly more questions than those of the IV group in both phases (first phase: t (35) = -

2.89, p = .007; second phase: t (58) = -1.94, p = .058). Finally, the number of distinct days and


the number of badges showed a significant difference between the two groups for both phases

with a higher mean value in the UCD group (Distinct Days: first phase: t (39) = -4.06, p < .001,

second phase: t (59) = -5.60, p < .001; Badges: first phase: t (37) = -3.82, p = .001, second

phase: t (57) = -6.20, p < .001). Additionally, the students’ preferences between IV and UCD

systems were investigated as shown in Figure 5-3. 38 out of 62 students (61.29%) answered that

the UCD system was more helpful for learning, 42 students (67.74%) answered that the UCD

system better motivated them to participate, and 43 students (69.35%) answered that the UCD

system increased their enjoyment of the website. Overall, 47 students out of 62 (75.80%)

preferred to use the UCD system. These results provide convincing evidence that ease of use has a positive effect on the level of engagement in gamification website activity,

resulting in acceptance of Hypothesis 1.

Table 5-4 The summary of gamification activities between two groups for both phases

                              1st Phase                                   2nd Phase
Activity           Website    N    Mean (SD)        P Value     d        N    Mean (SD)        P Value     d
Number of          IV         38   5.45 (3.76)      .182        -.40     23   7.52 (7.64)      .983        -.01
Questions          UCD        24   7.46 (6.58)                           39   7.56 (7.25)
Quality of         IV         38   2.30 (1.19)      .019*       -.61     23   3.18 (.55)       .36         -.07
Questions          UCD        24   3.00 (1.06)                           39   3.15 (.35)
Difficulty of      IV         38   1.15 (.57)       <.001***    -.88     23   1.88 (.22)       <.001***    1.46
Questions          UCD        24   1.64 (.53)                            39   1.51 (.27)
Number of          IV         38   3.42 (4.55)      .17         -.34     23   1.61 (2.81)      .054†       -.44
Followers          UCD        24   2.04 (3.25)                           39   3.82 (6.00)
Number of          IV         38   21.08 (15.81)    .007**      -.82     23   38.96 (22.48)    .058†       -.23
Answers            UCD        24   36.96 (23.83)                         39   54.23 (44.00)
Number of          IV         38   6.87 (9.98)      .247        -.32     23   15.39 (30.40)    .24         .40
Comments           UCD        24   10.42 (12.49)                         39   7.56 (8.82)
Number of          IV         38   13.61 (3.18)     <.001***    -1.12    23   15.13 (2.51)     <.001***    -1.28
Distinct Days      UCD        24   17.63 (4.14)                          39   20.18 (4.58)
Number of          IV         38   5.58 (2.50)      .001***     -.108    23   5.35 (2.57)      <.001***    -.15
Badges             UCD        24   8.75 (3.55)                           39   10.18 (3.53)
† p < .1  * p < .05  ** p < .01  *** p < .001
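The group comparisons summarized in Table 5-4 can be reproduced with a two-sample t-test in SciPy; the sketch below uses Welch's unequal-variance variant (a reasonable choice given the unequal group sizes, though the exact variant used in the original analysis is not stated) and illustrative data.

    import numpy as np
    from scipy import stats

    # Illustrative per-student counts of answers submitted in one phase
    iv_answers = np.array([12, 25, 18, 30, 9, 22, 15, 28], dtype=float)   # IV group
    ucd_answers = np.array([35, 48, 27, 60, 41, 33, 52], dtype=float)     # UCD group

    # Welch's (unequal-variance) two-sample t-test
    t_stat, p_value = stats.ttest_ind(iv_answers, ucd_answers, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")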


Figure 5-3 Results of students’ preferences between IV and UCD systems

H2: Different personality traits among students will lead to different levels of

gamification engagement and different learning outcomes.

Table 5-5 shows the correlation results among students’ personality traits, their level of

gamification engagement, and learning outcomes. There was a positive correlation between the

level of gamification engagement and learning outcomes (r = .68). For the level of gamification

engagement, all personality traits except for agreeableness showed significant relationships.

While extraversion, conscientiousness, and openness to experience were positively correlated

with the level of gamification engagement with r values of .44, .56, and .22 respectively (p < .1

and p < .05), neuroticism negatively correlated with the level of gamification engagement with r

values of -.41 (p < .05). Learning outcomes displayed a similar trend. There was positive

correlation between conscientiousness and learning outcomes with an r value of .59 (p < .05).

However, a negative correlation was identified between neuroticism and learning outcomes with

an r value of -.43 (p < .05). These results support Hypothesis 2.


Table 5-5 Correlation results among gamification engagement, learning outcomes, and students' personality traits (n=62)

Pearson correlations, with two-tailed significance in parentheses:

                              1                 2                 3               4              5              6              7
1. Gamification Engagement    1
2. Learning Outcomes          .68 (<.001***)    1
3. Extraversion               .44 (<.001***)    .19 (.132)        1
4. Agreeableness              .05 (.721)        .12 (.354)        .10 (.44)       1
5. Conscientiousness          .56 (<.001***)    .59 (<.001***)    .37 (.002**)    .18 (.172)     1
6. Neuroticism                -.41 (.001***)    -.43 (.001**)     -.33 (.008**)   -.18 (.172)    -.18 (.16)     1
7. Openness to experience     .22 (.09†)        .26 (.039*)       .19 (.13)       .17 (.183)     .26 (.044*)    -.03 (.818)    1
† p < .1  * p < .05  ** p < .01  *** p < .001

H3: Different personality traits among students will yield different student self-

perceptions of (1) how the gamification system helps active learning, (2) how the gamification

system motivates them, and (3) how often they keep track of game elements in the gamification

system.

The relationship between students’ personality traits and students’ perceptions of active

learning, motivation, and game elements was investigated using the Pearson correlation

coefficient. The results of this analysis can be seen in Table 5-6. For students’ perception of

active learning and game elements, there were significantly positive correlations with two

personality dimensions: extraversion and conscientiousness (p < .1 and p < .01). Students' perception of motivation showed significantly positive correlations (p < .1, p < .01, and p < .001) with extraversion, agreeableness, and conscientiousness. Neuroticism showed significantly negative correlations (p < .05 and p < .01) with students' perceptions of all factors.

Table 5-6 Correlation results between the questionnaire results and students' personality traits

Pearson correlations, with two-tailed significance in parentheses:

                              1                 2                3                 4                5              6                7
1. Active Learning            1
2. Game elements              .457 (<.001***)   1
3. Motivation                 .545 (<.001***)   .399 (.001**)    1
4. Extraversion               .224 (.080†)      .223 (.082†)     .336 (.008**)     1
5. Agreeableness              -.080 (.535)      .137 (.288)      .221 (.085†)      .100 (.440)      1
6. Conscientiousness          .387 (.002**)     .242 (.058†)     .453 (<.001***)   .307 (.002**)    .180 (.172)    1
7. Neuroticism                -.294 (.020*)     -.319 (.011**)   -.335 (.008**)    -.330 (.008**)   -.180 (.172)   -.180 (.160)     1
8. Openness to experience     -.049 (.706)      .055 (.674)      .146 (.257)       .190 (.130)      .170 (.183)    .260 (.044**)    -.030 (.818)
† p < .1  * p < .05  ** p < .01  *** p < .001

5.3.3 Discussion

The purpose of this study was to investigate (1) how to develop an effective gamification system by

applying the UCD process in the development of that gamification system and (2) the role of

students’ personality traits in the effects of gamification in terms of motivation, engagement, and

learning outcomes as based on students’ performance and perspective. Overall, the results of the

data analysis demonstrate support for all hypotheses. Applying the UCD process had a positive


effect on building an effective gamification system, based on the results for H1. However, there was no significant difference in the number of questions between the two groups. A possible explanation for this finding may be that most of the modifications to the main activities made through the UT are associated with answering questions, not creating questions. Thus, there was no

significant improvement regarding creating questions. There were, on the other hand, significant

differences in the number of answers between the two groups in both phases, with the higher

number of answers submitted to the UCD website in both phases. Furthermore, the number of

distinct days, which represents the overall students’ engagement in the gamification system,

indicated that students in the UCD groups accessed the gamification system significantly more

frequently than students in the IV groups. Thus, we have concluded that the UCD process can be

used for developing an effective gamification system by having the users participate in the

development process.

Regarding H2, the results of the data analysis demonstrate a number of relationships

between students’ personality traits, gamification engagement, and learning outcomes. The first

relationship was identified as follows: the more extroverted students are, the higher the level of

their gamification engagement. Since extraversion is a personality trait that can be characterized

by sociability, outgoingness, assertiveness, and high amounts of emotional expressiveness, it

follows that a leaderboard, ranking, score, and badges in gamification can be motivating to a

student who is more extroverted. This result is consistent with prior research (Buckley & Doyle,

2017; Jia, Xu, Karanam, & Voida, 2016). In the second relationship identified,

conscientiousness is positively correlated with both gamification engagement and learning outcomes. Conscientiousness is characterized by high levels of thoughtfulness, good impulse control, goal-directed behaviors, and precision. This finding was unexpected, as it is the opposite of the result of a previous study conducted by Buckley and Doyle (2017). They suggested the following three reasons to explain the negative relationship between conscientiousness


and gamification engagement: (1) the unstructured, chaotic, competitive nature of gamification,

(2) cognitive dissonance between play and work, and (3) no definitely correct way to solve a

problem. However, the gamification system used in the present study has a structured format with main activities that can be solved in definitively correct ways. Furthermore, creating questions that

require higher-order cognitive skills and answering such questions are commonly considered

means of exam preparation. Thus, these features of the current gamification system enable the more

conscientious students to be more engaged as compared with other students, leading to better

learning outcomes.

In the literature, several studies have demonstrated a positive relationship between

conscientiousness and learning outcomes (Komarraju et al., 2009; Noftle & Robins, 2007;

Wagerman & Funder, 2007). For example, Noftle and Robins (2007) reported that

conscientiousness is the strongest predictor of academic performance among the five personality traits. Wagerman and Funder (2007) confirmed this finding, showing conscientiousness to be a

predictor of college performance as indexed by both freshman GPA and senior GPA. The third

identified relationship is that neuroticism negatively affects gamification engagement and

learning outcomes. Students who are high in neuroticism tend to experience

emotional instability, anxiety, moodiness, and irritability, leading to lower engagement in

gamification activity and lower learning outcomes. A number of studies have suggested that

there is a negative relationship between neuroticism and academic performance, as shown in our

study (Chamorro-Premuzic & Furnham, 2003; Petrides, Chamorro-Premuzic, Frederickson, &

Furnham, 2005; Vedel, 2014). One potential main reason is that students who are higher in

neuroticism tend to experience anxiety and stress, impairing their performance (Petrides et al.,

2005). Furthermore, the leaderboard, score, and ranking system in gamification websites can

cause quite competitive behavior, which can lead to negative emotions and lower academic

performance in students. The results of the present study regarding neuroticism and gamification


were consistent with previous research (Buckley & Doyle, 2017; Chamorro-Premuzic &

Furnham, 2003). The last relationship identified in this study was a positive correlation between

openness to experience and the level of gamification engagement. Since one of the characteristics of this trait is a broad range of interests, this result is not surprising. It is inconsistent with a previous study conducted by Jia et al. (2016), but that study measured only individual game elements and identified a negative relationship between openness to experience and a single game element (the avatar).

With respect to hypothesis H3, the results of the data analysis demonstrate relationships very similar to those found for hypothesis H2. In summary, the perception of active learning in the gamification activity was positively correlated with three personality traits: extraversion, conscientiousness, and openness to experience. This result may offer a possible

explanation for the relationship between personality traits and gamification engagement. The

students who were high in extraversion, conscientiousness, and openness to experience

perceived gamification activities as active learning, which resulted in their greater engagement

in gamification activities. Two personality traits, extraversion and conscientiousness, correlated

positively with both the frequency with which students kept track of game elements in the

gamification system and with students’ perception of how well the gamification system

motivates them. Furthermore, there was a negative relationship between neuroticism and

students’ perception of how often they kept track of game elements in the gamification system.

This is because moderate neuroticism can be beneficial and motivating, whereas too much can cause adverse effects. Game elements such as ranking, score, and leaderboard, which were the factors that contributed most to the quiet competition between students, may prompt negative emotions that lead to lower performance.


Table 5-7 Stepwise regression analysis of gamification engagement against the Big Five factors

                       B            SE          Beta     t        p-value
Constant               -15878.731   9127.194             -1.740   .084†
Extraversion           8852.312     1747.957    .354     5.064    .000***
Conscientiousness      4134.009     1207.881    .238     3.423    .001**
Neuroticism            -4930.385    1575.257   -.211    -3.130    .002**

† p < .1, * p < .05, ** p < .01, *** p < .001.

With regard to the predictability of students' engagement in the gamification activity based on their personality attributes, a stepwise multiple regression was conducted, treating the Big Five factors as the predictor variables and gamification engagement as the dependent variable. The results are summarized in Table 5-7 above. The findings indicated that

the combination of personality variables accounted for 28.8% of variance in gamification

engagement, R = .537, generating a statistically significant model, F (3,161) = 21.339, p < .001.

Within the model, greater levels of extraversion and conscientiousness significantly predicted

more engagement in gamification activity, whereas greater levels of neuroticism predicted lower

engagement in gamification activity.
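To make the analysis above concrete, the following is a minimal sketch of a forward stepwise regression in Python. It is not the SPSS procedure used in the study; the DataFrame `df`, its column names, and the entry criterion (alpha = .05) are assumptions for illustration only.

    # A minimal sketch of forward stepwise regression of gamification engagement
    # on the Big Five scores. Not the SPSS procedure used in the study; the
    # DataFrame `df` and its column names are assumed for illustration.
    import pandas as pd
    import statsmodels.api as sm

    def forward_stepwise(df: pd.DataFrame, response: str, candidates, alpha_in: float = 0.05):
        """Add the predictor with the smallest p-value below alpha_in at each step."""
        selected, remaining = [], list(candidates)
        while remaining:
            pvals = {}
            for var in remaining:
                X = sm.add_constant(df[selected + [var]])
                pvals[var] = sm.OLS(df[response], X).fit().pvalues[var]
            best = min(pvals, key=pvals.get)
            if pvals[best] >= alpha_in:
                break
            selected.append(best)
            remaining.remove(best)
        final = sm.OLS(df[response], sm.add_constant(df[selected])).fit() if selected else None
        return selected, final

    # Assumed usage (column names are hypothetical):
    # traits = ["extraversion", "agreeableness", "conscientiousness", "neuroticism", "openness"]
    # chosen, model = forward_stepwise(df, "engagement", traits)
    # print(model.summary())   # unstandardized B, SE, t, p, and R-squared

Standardized coefficients (Beta) like those in Table 5-7 would additionally require standardizing the variables (z-scores) before fitting.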

In summary, the key result of the current study is that the effects of gamification on motivation, engagement, and learning outcome, as measured by both students' performance and students' perspectives, vary depending on individual attributes. In addition, we suggest that

gamification developers apply UCD in the development process in order to make gamification

more effective.

Although this study expands our knowledge of the roles of personality and UCD in the effects of gamification as a supporting tool among university students, it is still subject to some

limitations. Since we used a specific gamification system as a unitary whole, we did not cover

how particular elements such as score, badges, level, ranking, leaderboard, etc. might influence

students’ motivation and learning outcomes. Thus, further studies are needed to investigate the

impact of each game element on students’ motivation and learning outcomes. Another limitation


of this study is that we did not cover the effect of gamification on students’ intrinsic and

extrinsic motivation. In the literature and in this study, most gamification systems used are

reward-based, awarding students points or badges whenever they complete a predefined task.

However, some researchers have critiqued reward-based gamification, arguing that it cannot

increase intrinsic motivation and fails to change students’ behavior. Without empirical evidence,

it is still not clear what effect these mostly reward-based gamification systems have on intrinsic

motivation and how exactly they affect motivation. Thus, our future study will explore this

problem by analyzing students’ motivation based on self-determination theory and self-efficacy.


Chapter 6

Exploring the relationship between gamification and motivation through the

lens of self-determination theory (SDT)

6.1 Introduction

This study investigated in detail the relationship between gamification and motivation

through the lens of self-determination theory (SDT). In addition, because maintaining student

motivation from the beginning to the end of the learning process is a major concern in higher

education, we determined whether gamification can maintain student motivation from the

beginning to the end of the semester. In this study, we hypothesized the following:

H1: Gamification can maintain the student’s motivation over the course of the semester.

H2: Gamification has a significantly positive relationship with autonomous motivation.

H3: Gamification has a significantly positive relationship with controlled motivation.

H4: Gamification has a significantly negative relationship with amotivation.

H5: Gamification has a significantly positive relationship with learning outcomes.

H6: Autonomous motivation has a significantly positive relationship with learning

outcomes.

H7: Controlled motivation has a significantly negative relationship with learning

outcomes.

H8: Amotivation has a significantly negative relationship with learning outcomes.


6.2 Method

6.2.1 Participants and procedure

The experiment was conducted in the fall semesters of 2016 and 2017 with 148 students (63 in 2016, 59 in 2017) with an average age of 20.33 years (SD = .76). Data were collected from students enrolled in the required introductory human factors course, a third-year undergraduate IE course. The course was offered in a traditional face-to-face classroom environment. Students who participated in this study received extra credit of up to 2.5% of their final grade for each phase in which they participated, based on their performance in the gamification activity, as summarized in Table 6-1. For students who did not participate in this study, we provided another extra credit option in order to avoid equity issues.

Table 6-1 Summary of extra credit for participating in this study

                                                                Extra Credit
Sign-up extra credit                                            0.5%
Minimum requirement extra credit (3 Q + 15 A)                   1.0%
Additional extra credit (only if ranked in the top 5%)          1.0%

Participants were first introduced to the purpose of this study and to the gamification website by watching an instructional video in the second week (first lab) of the semester. They were asked to complete the general knowledge test, and they had practice time for gamification activities such as authoring questions and answering questions created by their classmates. The first phase took place from the 5th week of the semester to the midterm (through week 8). During the first phase, students could conduct the gamification activities as frequently as they wanted but were limited to creating no more than 5 questions per day and answering no more than 15 questions per day. In the week before the midterm (week 7), students were asked to complete a questionnaire on their motivation. From the 13th week through final exams (week 16), the second phase of this study was conducted using the same procedure as the first phase. The second questionnaire was collected in the week before the final exam (week 15).

6.2.2 Measurement

6.2.2.1 Gamification

Students’ gamification activities were measured. Whenever students created their own questions about the lab materials or answered questions created by other students, they received points. The specific point algorithm is shown in Table 6-2. Students’ scores were then used to determine their level and ranking for competition between the students. More detailed information on the gamification website is available in a previous study (Kim et al., 2018).

Table 6-2 Score algorithm in gamification systems

Creating a question      Points       Answering a question     Points
Basic score              300          Correct answer           200
Feedback score           0 or 200     Wrong answer             0
Quality score            Up to 500
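The scoring rule in Table 6-2 can be expressed compactly; the sketch below is one possible encoding, and how the feedback and quality components are awarded on the actual website is an assumption here.

    # A minimal sketch of the Table 6-2 scoring rule; the feedback/quality inputs
    # are assumptions about how the website awards them.
    def question_score(received_feedback: bool, quality_points: int) -> int:
        """Points for authoring one question: 300 base, 0 or 200 for feedback, up to 500 for quality."""
        quality = max(0, min(quality_points, 500))  # clamp to the 0-500 range
        return 300 + (200 if received_feedback else 0) + quality

    def answer_score(correct: bool) -> int:
        """Points for answering one question: 200 if correct, 0 otherwise."""
        return 200 if correct else 0

    # Example: one question with feedback and a quality score of 350, plus three correct answers.
    print(question_score(True, 350) + 3 * answer_score(True))  # 850 + 600 = 1450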

6.2.2.2 Questionnaire for students’ motivation.

While there are various instruments that allow for the operationalization of intrinsic and extrinsic motivation, amotivation is assessed solely by the AMS (Vallerand et al., 1992). Whereas the original AMS was designed as a global measure of academic motivation, it was modified to the HF course context in this study. The AMS consists of seven subscales, each of which is assessed with four items on a seven-point Likert scale ranging from 1 = does not correspond at all to 7 = corresponds exactly: IM–to know, IM–toward accomplishment, IM–to experience stimulation, EM–identified, EM–introjected, EM–external regulation, and amotivation. The original AMS showed good reliability (Cronbach's alpha ranging from 0.63 to 0.86 across subscales), validity (Normed Fit Index = 0.93), and repeatability (one-month test-retest correlation of r = 0.79) (Vallerand et al., 1992). In this study, we used the variables Autonomous Motivation (AM), Controlled Motivation (CM), and Amotivation, following previous studies (Grolnick & Ryan, 1987; Herath, 2015; Vansteenkiste, Zhou, Lens, & Soenens, 2005). AM is a measure of the amount of self-determined motivation, that is, motivation that comes from within the student; it was calculated by summing the average scores on the intrinsic motivation and identified regulation subscales of the AMS. CM is a measure of motivation that originates outside the individual, meaning that it is determined by external factors or reasons; it was calculated by summing the average scores on the introjected and external regulation subscales of the AMS.
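As an illustration of how the AM and CM composites described above can be computed, the sketch below assumes each AMS subscale is supplied as a list of its four item responses; the subscale labels and example responses are placeholders, not the instrument's official variable names or study data.

    # A minimal sketch of the AM/CM composites; subscale labels and the example
    # responses are placeholders, not study data.
    from statistics import mean

    def compute_am_cm(subscales: dict) -> tuple:
        """Return (AM, CM) from a dict mapping subscale name -> list of 1-7 item scores."""
        avg = {name: mean(items) for name, items in subscales.items()}
        # AM: the three intrinsic-motivation subscales plus identified regulation
        am = avg["im_know"] + avg["im_accomplish"] + avg["im_stimulation"] + avg["em_identified"]
        # CM: introjected plus external regulation
        cm = avg["em_introjected"] + avg["em_external"]
        return am, cm

    example = {
        "im_know": [5, 6, 5, 6], "im_accomplish": [4, 5, 5, 4], "im_stimulation": [4, 4, 5, 5],
        "em_identified": [6, 6, 5, 6], "em_introjected": [3, 4, 4, 3], "em_external": [6, 5, 6, 6],
    }
    print(compute_am_cm(example))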

6.2.2.3 Learning outcome

Students’ learning outcome was measured based on the grades obtained in the course. Although Rovai et al. (2009) argued that using grades to operationalize learning may not always provide the best results, grades give a more objective measure than self-reported measurement and are the most prevalent measure of cognitive learning outcomes (Dumont, 1996; Hiltz & Wellman, 1997). In this study, we used the general knowledge test score, the midterm score, and the final exam score in order to normalize the grade. For the first phase, the normalized grade is the difference between the general knowledge test score and the midterm score; for the second phase, it is the difference between the general knowledge test score and the final exam score.
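The normalization described above amounts to differencing each exam score against the general knowledge baseline. The sketch below assumes hypothetical column names and a sign convention of exam score minus baseline.

    # A minimal sketch of the grade normalization; the column names ("gk",
    # "midterm", "final") and the exam-minus-baseline sign are assumptions.
    import pandas as pd

    def normalized_grades(scores: pd.DataFrame) -> pd.DataFrame:
        out = scores.copy()
        out["phase1_norm"] = out["midterm"] - out["gk"]  # phase 1: midterm vs. baseline
        out["phase2_norm"] = out["final"] - out["gk"]    # phase 2: final exam vs. baseline
        return out

    # Placeholder scores (not study data):
    print(normalized_grades(pd.DataFrame({"gk": [34, 31], "midterm": [79, 82], "final": [83, 87]})))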


6.2.3 Data analysis

The Statistical Package for the Social Sciences (SPSS) and Amos were used to perform all of the statistical analyses, including paired t-tests, two-sample t-tests, and structural equation modeling. The two-sample t-test was used to compare the data from the two semesters in terms of gamification activity and learning outcome. The paired t-test was used to evaluate hypothesis H1. Finally, H2 through H8 were tested using SEM.
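For readers who want to reproduce the comparisons with open-source tools rather than SPSS, the sketch below shows the two kinds of t-test on placeholder arrays (not the study data).

    # A minimal sketch of the two t-tests; the arrays are placeholder values.
    from scipy import stats

    scores_2016 = [79, 84, 72, 88, 81]        # an outcome for the 2016 group
    scores_2017 = [83, 85, 78, 90, 86, 82]    # the same outcome for the 2017 group
    t_ind, p_ind = stats.ttest_ind(scores_2016, scores_2017)   # between-semester comparison

    motivation_t1 = [4.5, 5.0, 4.8, 5.2]      # the same students at time 1 ...
    motivation_t2 = [4.9, 5.1, 4.7, 5.5]      # ... and at time 2 (paired, for H1)
    t_rel, p_rel = stats.ttest_rel(motivation_t1, motivation_t2)

    print(p_ind, p_rel)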

6.3 Results

We compared students’ exam scores (general knowledge, midterm, and final) and gamification activities, including the number of questions authored, the number of answers submitted, and the number of distinct days of activity, between the two semesters (Fall 2016 and Fall 2017). The experimental conditions were identical in both semesters except for the instructor and the students. The results are shown in Table 6-3 and Table 6-4. There was no significant difference in students’ general knowledge scores between the two semesters (t(81) = 1.184, p = .281). For students’ learning outcomes, namely the midterm and final exam scores, there was no significant difference between the two semesters (midterm: t(81) = -1.497, p = .146; final: t(81) = -.776, p = .440). For gamification activities, although there was a significant difference in question difficulty between the two semesters in the second phase (t(96) = -4.90, p < .01), the main activities, such as the numbers of questions and answers, did not differ significantly between the two semesters. Thus, we combined all data from the two semesters for further hypothesis testing.


Table 6-3 Students’ exam scores for the two semesters

Grade                          Semester   N    Mean (SD)       p-value
General knowledge test score   2016       24   34.58 (11.41)   .281
                               2017       59   31.69 (9.50)
Midterm exam score             2016       24   79.08 (10.89)   .146
                               2017       59   82.59 (5.72)
Final exam score               2016       39   82.72 (7.92)    .440
                               2017       59   87.33 (5.69)

Table 6-4 Summary of gamification activities between the two groups for both phases

Activity                  Year   1st Phase                               2nd Phase
                                 N    Mean (SD)       p       d          N    Mean (SD)       p        d
Number of Questions       2016   24   7.46 (6.58)     .385    .25        39   7.56 (7.25)     .582     .11
                          2017   59   6.24 (2.44)                        59   8.25 (5.11)
Quality of Questions      2016   24   3.00 (1.06)     .337    .26        39   3.15 (0.35)     .076†    .36
                          2017   59   3.22 (0.56)                        59   3.01 (0.41)
Difficulty of Questions   2016   24   1.64 (0.53)     .217    .34        39   1.51 (0.27)     < .01**  1.02
                          2017   59   1.79 (0.32)                        59   1.80 (0.30)
Number of Followers       2016   24   2.04 (3.25)     .542    .14        39   3.82 (6.00)     .117     .35
                          2017   59   2.46 (2.61)                        59   2.20 (2.43)
Number of Answers         2016   24   36.96 (23.83)   .170    .30        39   54.21 (43.97)   .848     .04
                          2017   59   46.47 (37.11)                      59   56.15 (52.23)
Number of Comments        2016   24   10.42 (12.49)   .528    .14        39   7.56 (8.82)     .459     .16
                          2017   59   8.90 (8.66)                        59   6.41 (4.95)
Number of Distinct Days   2016   24   17.63 (4.14)    .074†   .38        39   20.18 (4.58)    .495     .15
                          2017   59   15.31 (7.41)                       59   21.02 (6.67)
Number of Badges          2016   24   8.75 (3.55)     .458    .18        39   10.18 (3.53)    .348     .20
                          2017   59   8.07 (3.86)                        59   9.36 (4.63)

† p < .1, * p < .05, ** p < .01, *** p < .001.
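Table 6-4 reports an effect size d alongside each p-value. The sketch below computes a pooled-standard-deviation Cohen's d from group summary statistics; the exact d variant used in the table is not stated, so values computed this way may differ slightly from those reported.

    # A minimal sketch of Cohen's d with a pooled standard deviation, computed
    # from group summaries (n, mean, SD); one common variant among several.
    import math

    def cohens_d(n1, m1, sd1, n2, m2, sd2):
        pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
        return (m1 - m2) / math.sqrt(pooled_var)

    # Example: number of questions in the 1st phase, 2016 vs. 2017 (values from Table 6-4)
    print(abs(cohens_d(24, 7.46, 6.58, 59, 6.24, 2.44)))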

H1 Gamification can maintain the student’s motivation over the course of the semester.

A paired t-test was conducted on all of the motivational measures. Table 6-5 shows the mean and standard deviation for AM, CM, and amotivation. In general, students’ levels of motivation did not decrease over time. Specifically, there was a significant increase in students’ AM and no significant change in students’ CM, whereas students’ amotivation significantly declined over the course of the semester.


Table 6-5 Mean and standard deviation for AM, CM, and amotivation

Motivation    Time 1: Mean (SD)   Time 2: Mean (SD)
AM            4.81 (0.89)         4.98 (1.07)
CM            5.20 (0.71)         5.31 (0.83)
Amotivation   3.27 (1.51)         2.94 (1.50)

6.3.1 Structured model evaluation

Hypotheses H2 through H8 were tested using structural equation modeling (SEM). The reliabilities of the dimensions in this study (Table 6-6) ranged from 0.750 to 0.933, above the 0.7 threshold for Cronbach’s alpha for each dimension. The average variance extracted (AVE) ranged between 0.61 and 0.70, larger than the 0.5 criterion. This study therefore satisfies the reliability and validity conditions.

Table 6-6 Reliability testing

Variables Cronbach’s alpha

IM to know 0.767

IM accomplishment 0.769

IM stimulation 0.768

EM-identified regulation 0.750

EM-introjected regulation 0.752

EM-external regulation 0.773

Amotivation 0.933

AM 0.751

CM 0.753

IM intrinsic motivation, EM extrinsic motivation, AM autonomous motivation, CM controlled motivation
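The Cronbach's alpha values in Table 6-6 follow the standard formula; a minimal sketch is given below, assuming each subscale's four items are columns of a pandas DataFrame, with placeholder responses rather than study data.

    # A minimal sketch of Cronbach's alpha for one subscale; `items` is assumed
    # to be a DataFrame whose columns are that subscale's four Likert items.
    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_variances / total_variance)

    # Placeholder responses (not study data), four items from five respondents:
    print(cronbach_alpha(pd.DataFrame({"i1": [5, 6, 4, 5, 6], "i2": [5, 5, 4, 6, 6],
                                       "i3": [4, 6, 3, 5, 5], "i4": [5, 6, 4, 6, 6]})))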

The correlations between the different variables were as follows (see Table 6-7): amotivation was significantly negatively correlated with all other variables. AM and CM were significantly positively correlated, which was expected, as it had been observed in earlier studies (Vansteenkiste et al., 2005). Learning outcome and gamification were both significantly positively correlated with AM and CM.


Table 6-7 The correlations between the different variables

                   Amotivation   AM         CM        Learning Outcome
AM                 -0.439***
CM                 -0.201**      0.614***
Learning Outcome   -0.326***     0.439***   0.238**
Gamification       -0.317***     0.438***   0.252**   0.57***

AM autonomous motivation, CM controlled motivation. † p < .1, * p < .05, ** p < .01, *** p < .001.

The results of the structural equation model analysis showed an acceptable fit to the data: χ²(df = 18, N = 166) = 25.738, p = .106, GFI = .971, RMSEA = .049, CFI = .992, AGFI = .926, and SRMR = .042. The estimated model, with path coefficients, is depicted in Figure 6-1. However, neither the path from CM to learning outcome nor the path from amotivation to learning outcome was significant. The model explained 21.8%, 7.5%, 10%, and 39% of the variance in AM, CM, amotivation, and learning outcome, respectively. Based on the structural model analysis, gamification activity had a significant positive influence on AM (β = .467, p < 0.001) and CM (β = .273, p < 0.001) and a negative influence on amotivation (β = -.317, p < 0.001). Gamification activity also had a significant positive influence on learning outcome (β = .431, p < 0.001). Finally, AM had a significant positive influence on learning outcome (β = .295, p < 0.01).
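The hypothesized structural model can also be written in lavaan-style syntax and fitted with an open-source SEM package. The sketch below uses the semopy Python package rather than Amos, and the variable names and the random placeholder data are assumptions for illustration only.

    # A minimal sketch of the structural paths (H2-H8) in lavaan-style syntax,
    # fitted with semopy on random placeholder data (not the study data).
    import numpy as np
    import pandas as pd
    import semopy

    rng = np.random.default_rng(0)
    data = pd.DataFrame(rng.normal(size=(166, 5)),
                        columns=["gamification", "AM", "CM", "amotivation", "learning_outcome"])

    desc = """
    AM ~ gamification
    CM ~ gamification
    amotivation ~ gamification
    learning_outcome ~ gamification + AM + CM + amotivation
    """

    model = semopy.Model(desc)
    model.fit(data)
    print(model.inspect())           # path estimates and p-values
    print(semopy.calc_stats(model))  # fit indices such as CFI and RMSEA

semopy reports unstandardized estimates by default; standardized coefficients comparable to the β values above can be obtained by standardizing the observed variables before fitting.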


Figure 6-1 Structural equation model depicting relationship between gamification, motivation,

and performance

6.4 Discussion

This study was designed to test (1) the hypothesis that gamification can maintain student motivation over the course of the semester and (2) a hypothesized model in which gamification would positively affect intrinsic and extrinsic motivation but negatively affect amotivation, with motivation in turn affecting academic performance. Overall, the results of the data analysis demonstrate support for all hypotheses.

Regarding H1, previous studies have shown that students’ levels of motivation decrease over time (Brouse, Basch, Leblanc, McKnight, & Lei, 2010; Nilsson & Warrén Stomberg, 2008; Zusho et al., 2003). For example, Zusho et al. (2003) assessed the motivation of 458 students enrolled in introductory college chemistry classes at three time points over a semester. They found a decline in students’ motivation, including self-efficacy, task value, and performance goals. Nilsson and Warrén Stomberg (2008) found the same decreasing trend in the motivation of nursing students in Sweden. In contrast, our results showed that students’ intrinsic and extrinsic motivation did not decrease over time, which implies that gamification plays a mediating role in maintaining students’ motivation during the semester. Furthermore, students’ amotivation decreased significantly over time. Theoretically, those with high levels of amotivation are more likely to engage in negative behaviors because they are more likely to be disengaged and unattached to learning (Larson, 2000). Thus, I suggest that applying gamification as a supporting tool in the learning environment may help change students’ behavior by converting amotivation into extrinsic or intrinsic motivation.

Regarding H2 through H8, I found that gamification engagement is positively associated with autonomous and controlled motivation as well as with students’ performance, but negatively associated with amotivation. The positive relationship between gamification engagement and controlled motivation was expected, since our gamification system is reward-based, awarding students points or badges whenever they complete a predefined task. For example, whenever students created their own questions or answered questions created by classmates, they received points that were used to calculate their ranking. Previous studies on gamification in education showed the same trend as our results (Denny, 2013; Goehle, 2013; Hamari, 2017; Li, Grossman, & Fitzmaurice, 2012). The positive relationship between gamification engagement and autonomous motivation, however, is an unexpected result, because our gamification is a reward-based system, which has been criticized as diminishing students’ intrinsic motivation according to cognitive evaluation theory (CET). One possible explanation is that even though students started to participate in the gamification activity because of the rewards, which relate to controlled motivation, their extrinsic motivation was converted into intrinsic motivation as they experienced enjoyment from game elements such as badges, points, and feedback during the gamification activity. This result was already observed in the previous experiment. Thus, I expect that reward-based gamification can increase not only students’ extrinsic motivation but also their intrinsic motivation, resulting in changes in students’ behavior. Autonomous motivation also had a significant impact on students’ learning. This result echoes previous research finding that intrinsic motivation is positively related to students’ academic achievement (Guay, Ratelle, & Chanal, 2008; Herath, 2015; Lin, McKeachie, & Kim, 2001). This is because students who feel competent when learning experience an increase in autonomous academic motivation, which in turn leads them to achieve higher exam scores. However, there is a negative relationship between controlled motivation and learning outcomes, as shown in previous research (Vansteenkiste et al., 2010). Finally, the results support that gamification engagement is positively associated with students’ learning outcomes. Even though the literature reports mixed results about the effect of gamification on learning outcomes, this study provides empirical evidence of a positive relationship between gamification engagement and students’ learning outcomes.

In summary, the key result of the current study is that gamification can be used as a supporting tool in education to sustain students’ motivation over time. Furthermore, I identified empirical evidence that even reward-based gamification can increase students’ intrinsic motivation, which makes it possible to change students’ behavior.


Chapter 7

Conclusion

The main objective of an educational curriculum is not only to convey certain

knowledge and skills to students but also to provide them with an understanding of real

problems and how to solve them. The design of such a curriculum should meet all learning

objectives while ensuring that students are motivated and engaged (Wood & Reiners, 2012). To

explore this objective, active learning, also known as learning-enhancing pedagogy, has received

considerable attention from engineering educators over the past several decades (Deslauriers et

al., 2011; Johnson et al., 1998; National Academy of Engineers, 2005; Prince, 2004). One

approach to achieving active learning is the application of gamification to an educational context

with the aim of creating a motivational atmosphere through constant feedback, mini challenges,

and positive reinforcement (Sinha, 2012; Wood & Reiners, 2012).

The application of gamification to education can be customized, especially in the case of

direct interaction among students and the instructor, thereby improving the engagement of

students through the use of game elements such as scores, levels, badges, avatars, and

leaderboards (Flatla, Gutwin, Nacke, Bateman, & Mandryk, 2011). Several researchers have

conducted empirical studies in an education setting to determine the effect of gamification on

students’ learning. However, outcomes from gamification studies in an education setting are not

consistent. Some studies show positive or partially positive effects of gamification on student

learning (Akpolat & Slany, 2014; Denny, 2013; Domínguez et al., 2013; Kim et al., 2016). But other

studies indicate no differences in performance between gamification groups and non-

gamification groups. In addition, little research has been conducted on the application of

gamification to engineering lab activities. Furthermore, most traditional gamification studies


have ignored important variables like the individual needs and personalities of students.

Therefore, this study investigates the application of User-Centered Design to the development of gamification in order to increase the effects of gamification on students performing engineering lab activities, through four different studies: (1) comparing students’ motivation, engagement, and learning outcomes between a gamification and a non-gamification system; (2) determining the effect of applying User-Centered Design on the development of an effective gamification system; (3) determining the role of students’ personality traits in the effects of gamification in terms of motivation, engagement, and learning outcomes; and (4) determining the relationship among gamification, each type of motivation (intrinsic, extrinsic, and amotivation), and learning outcomes.

The results of the first study showed (1) a positive relationship among a positive

perception of active learning in the gamification system, the level of gamification engagement,

and the learning outcome; (2) a positive relationship among students’ self-perceived motivation, gamification engagement, and learning outcomes; and (3) positive effects of the game elements in a gamification system on gamification engagement and learning outcome. I also found that game elements such as Ranking, Score, and Level were the ones that best motivated students and facilitated their enjoyment. In summary, I found that the application of gamification as a supporting tool in engineering lab activities has a positive effect on students’ motivation, engagement, and learning outcome, as shown by both students’ performance and students’ perspectives.

The second study found a total of 25 unique usability problems distributed among the

following 5 categories: (1) Design, (2) Navigation, (3) Game Element, (4) Main Activity, and

(5) Feedback. I also found higher student engagement with, and preference for, the UCD gamification version compared with the initial gamification version. Thus, I conclude that the UCD process

can be used for developing an effective gamification system by having the users participate in

the development process.


The third study investigated whether different personality traits among students lead to different levels of gamification engagement and different learning outcomes. The findings are that (1) the more extroverted students are, the higher their level of gamification engagement; (2) conscientiousness is positively correlated with both gamification engagement and learning outcomes; and (3) neuroticism negatively affects both gamification engagement and learning outcomes. Based on these results, I conclude that the effects of gamification in terms of motivation, engagement, and learning outcome vary depending on individual attributes.

The last study investigates whether gamification can maintain student motivation over

the course of an entire semester. I used structural equation modeling to investigate in depth the

relationship between gamification and student motivation within the framework of self-

determination theory. The results showed that students’ motivations, intrinsic and extrinsic, did

not decrease over time, while students’ amotivation significantly decreased over time, which

implies that gamification plays a role as a mediating factor in maintaining student motivation

over the course of a semester. I identified empirical evidence suggesting that even reward-based

gamification can increase students’ intrinsic and extrinsic motivation, making it possible to

change students’ behavior as well as their learning outcomes.

The present study is one of the first to cover several aspects still underexplored in

current gamification research. I attempted to empirically evaluate the impact of applying UCD to

gamification on student motivation within an SDT framework, which is seldom empirically

studied in gamification literature. This study is also one of the first to empirically find that even

reward-based gamification can increase students’ intrinsic motivation, making it possible to

change students’ behavior. However, since I did not examine the effects of individual game elements on students’ motivation, more empirical research is necessary to determine why

particular game elements act as extrinsic or intrinsic motivators in a given context and how this

in turn shapes students’ behavior. I believe our study is a valuable first step in this direction and


may serve as a blueprint for future studies. I expect that these results will inform instructors who

are interested in gamifying their courses and will help them in deciding how to develop

gamification to use in their specific context.


Appendix A

The first questionnaire for gamification group

Question for Background

Do you regularly play games?

Yes No

I am familiar with game elements such as level, badge, leaderboard, and ranking.

Strongly Disagree     Disagree     Neither Agree nor Disagree     Agree     Strongly Agree

Question for Gamification Systems

Q1. Did you actively try to earn score (points)?

Never Rarely Sometimes Most of the Time Always

Q2. Did you actively try to earn badges?

Never Rarely Sometimes Most of the Time Always

Q3. Did you actively try to earn high overall rating for your question from classmates?

Never Rarely Sometimes Most of the Time Always

Q4. Did you keep track of your level?

Never Rarely Sometimes Most of the Time Always

Q5. Did you keep track of your ranking?

Never Rarely Sometimes Most of the Time Always

Q6. Did you keep track of the number of followers?

Never Rarely Sometimes Most of the Time Always

Q7. Did you keep track of feedback from your classmates?

Never Rarely Sometimes Most of the Time Always

Do you agree or disagree with the following statement:

Q8. These game elements increased my enjoyment of doing activities.

Strongly Disagree     Disagree     Neither Agree nor Disagree     Agree     Strongly Agree


Q9. These game elements motivated me to participate more than I would have otherwise.

Strongly Disagree     Disagree     Neither Agree nor Disagree     Agree     Strongly Agree

Q10. I think that my level of involvement was high.

Strongly Disagree     Disagree     Neither Agree nor Disagree     Agree     Strongly Agree

Q11. Creating my own questions was an effective way of learning in this class

Strongly Disagree     Disagree     Neither Agree nor Disagree     Agree     Strongly Agree

Q12. Answering questions that were created by classmates was an effective way of

learning in this class

Strongly Disagree     Disagree     Neither Agree nor Disagree     Agree     Strongly Agree

Q13. These activities (creating and answering questions) were helpful for preparing for the exam.

Strongly Disagree     Disagree     Neither Agree nor Disagree     Agree     Strongly Agree

Q14. This website was helpful for preparing for the exam.

Strongly Disagree     Disagree     Neither Agree nor Disagree     Agree     Strongly Agree

Q15. There was a sufficient number of questions to learn the lab material.

Strongly Disagree     Disagree     Neither Agree nor Disagree     Agree     Strongly Agree

Q16. I have sufficient time to complete these activities.

Strongly Disagree     Disagree     Neither Agree nor Disagree     Agree     Strongly Agree


Appendix B

The second questionnaire for gamification group

Question for Gamification elements

Which game elements (Score, Badge, Level, Ranking, Feedback, Avatar, Notification, etc.) do you

think motivated you to participate more than others?

Which game elements (Score, Badge, Level, Ranking, Feedback, Avatar, Notification, etc.) do you

think increased your enjoyment of using the website?

Which game elements (Score, Badge, Level, Ranking, Feedback, Avatar, Notification, etc.) should be

retained in this website?

Which game elements (Score, Badge, Level, Ranking, Feedback, Avatar, Notification, etc.) should

be removed from this website?

Suggestions

What suggestions do you have for improving this activity or website?


Appendix C

The questionnaire for non-gamification group

Do you agree or disagree with the following statement:

Q1. Creating my own questions was an effective way of learning in this class

Strongly Disagree Disagree Neither Agree nor Disagree Agree Strongly Agree

Q2. Answering questions that were created by classmates was an effective way of learning in this class

Strongly Disagree Disagree Neither Agree nor Disagree Agree Strongly Agree

Q3. These activities (creating and answering questions) were helpful for preparing for the exam.

Strongly Disagree Disagree Neither Agree nor Disagree Agree Strongly Agree

Q4. This website was helpful for preparing for the exam.

Strongly Disagree Disagree Neither Agree nor Disagree Agree Strongly Agree

Q5. There was a sufficient number of questions to learn the lab material.

Strongly Disagree Disagree Neither Agree nor Disagree Agree Strongly Agree

Q6. I have sufficient time to complete these activities.

Strongly Disagree Disagree Neither Agree nor Disagree Agree Strongly Agree


References

Akpolat, B. S., & Slany, W. (2014). Enhancing software engineering student team engagement

in a high-intensity extreme programming course using gamification. In 2014 IEEE 27th

Conference on Software Engineering Education and Training, CSEE and T 2014 -

Proceedings (pp. 149–153). https://doi.org/10.1109/CSEET.2014.6816792

Alben, L. (1996). Defining the criteria for effective interaction design. Interactions, 3(3), 11–15.

Allen, D., & Tanner, K. (2005). Infusing Active Learning into the Large-enrollment Biology

Class: Seven Strategies, from the Simple to Complex. Cell Biology Education, 4(4), 262–

268. https://doi.org/10.1187/cbe.05-08-0113

Altbach, P. G., & Knight, J. (2007). The Internationalization of higher education: Motivations

and realities. Journal of Studies in International Education, 11(3–4), 290–305.

https://doi.org/10.1177/1028315307303542

Amabile, T. M., DeJong, W., & Lepper, M. R. (1976). Effects of Externally-Imposed Deadlines

on Subsequent Intrinsic Motivation. Journal of Personality and Social Psychology, 34(1),

92–98. Retrieved from http://psycnet.apa.org/journals/psp/34/1/92/

Anderson, L. W., Krathwohl, D. R., Airasian, P., Cruikshank, K., Mayer, R., Pintrich, P., … Wittrock, M. (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom’s taxonomy. New York: Longman Publishing.

Artz, A. F., & Armour-Thomas, E. (1992). Development of a cognitive-metacognitive framework for protocol analysis of mathematical problem solving in small groups. Cognition and Instruction, 9(2), 137–175.

Antin, J., & Churchill, E. E. (2011). Badges in social media: A social psychological perspective. In CHI 2011 (pp. 1–4). Retrieved from http://uxscientist.com/public/docs/uxsci_2.pdf

Aranyi, G., Van Schaik, P., & Barker, P. (2012). Using think-aloud and psychometrics to

explore users’ experience with a news Web site. Interacting with Computers, 24(2), 69–77.

https://doi.org/10.1016/j.intcom.2012.01.001

Armstrong, D. A. (2011). Students’ perceptions of online learning and instructional tools: A

qualitative study of undergraduate students use of online tools. TOJET: The Turkish Online

Journal of Educational Technology, 10(3).

Awwad, F., & Ayesh, A. (2013). Effectiveness of Laptop Usage in Uae University

Undergraduate Teaching. TOJET: The Turkish Online Journal of Educational Technology,

12(2). Retrieved from http://files.eric.ed.gov/fulltext/EJ1015528.pdf

Ayub, N. (2010). Effect of Intrinsic and Extrinsic Motivation on Academic Performance.

Pakistan Business Review, (November), 363–372.

Barak, M., Lipson, A., & Lerman, S. (2006). Wireless Laptops as Means For Promoting Active

Learning In Large Lecture Halls. Journal of Research on Technology in Education,


5191(November), 245–263. https://doi.org/10.1080/15391523.2006.10782459

Barata, G., Gama, S., Jorge, J., & Gonçalves, D. (2013). Improving participation and learning

with gamification. In Proceedings of the First International Conference on Gameful

Design, Research, and Applications - Gamification ’13 (pp. 10–17).

https://doi.org/10.1145/2583008.2583010

Barzilai, S., & Zohar, A. (2012). Epistemic thinking in action: Evaluating and integrating online

sources. Cognition and Instruction, 30(1), 39–85.

https://doi.org/10.1080/07370008.2011.636495

Bayles, M. E. (2002). Designing online banner advertisements: Should we animate? In

Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp.

363–366). ACM.

Bielik, P. (2012). Integration and adaptation of motivational factors into software systems. In

Personalized Web - Sciene, Technologies and Engineering: 11th Spring 2012 PeWe

Workshop (pp. 31–32). Retrieved from

https://wiki.fiit.stuba.sk/research/seminars/pewe/ontoparty-2011-2012-

spring/abstracts/31_bielik.pdf

Bodnar, C. A., & Clark, R. M. (2014). Exploring the Impact Game-based Learning Has on

Classroom Environment and Student Engagement Within an Engineering Product Design

Class. In Proceedings of the Second International Conference on Technological

Ecosystems for Enhancing Multiculturality (pp. 191–196). ACM.

https://doi.org/10.1145/2669711.2669899

Bogost, I. (2011). Gamification is bullshit. The Gameful World: Approaches, Issues,

Applications. Retrieved from

https://books.google.com/books?hl=en&lr=&id=KDxTBgAAQBAJ&oi=fnd&pg=PA65&d

q=Gamification+is+bullshit&ots=hJ97TV6LJR&sig=8oGAuYU6b14ayfhzMNvB0XNAU

8I

Boren, M. T., & Ramey, J. (2000). Thinking aloud: Reconciling theory and practice. IEEE

Transactions on Professional Communication, 43(3), 261–278.

https://doi.org/10.1109/47.867942

Brooks, D. C. (2016). ECAR Study of Undergraduate Students and Information Technology.

Retrieved from

https://library.educause.edu/resources/2016/6/~/media/files/library/2016/10/ers1605.pdf

Brouse, C. H., Basch, C. E., Leblanc, M., McKnight, K. R., & Lei, T. (2010). College students’

academic motivation: Differences by gender, class, and source of payment. College

Quarterly, 13(1), 1–10. Retrieved from https://eric.ed.gov/?id=EJ912093

Brown, S., & Walter, M. (2004). The Art of Problem Posing.

https://doi.org/10.4324/9781410611833

Buckley, P., & Doyle, E. (2017). Individualising gamification: An investigation of the impact of

learning styles and personality traits on the efficacy of gamification using a prediction


market. Computers and Education, 106, 43–55.

https://doi.org/10.1016/j.compedu.2016.11.009

Busato, V. V, Prins, F. J., Elshout, J. J., & Hamaker, C. (1998). The relation between learning

styles, the Big Five personality traits and achievement motivation in higher education.

Personality and Individual Differences, 26(1), 129–140. https://doi.org/10.1016/S0191-

8869(98)00112-3

Chamorro-Premuzic, T., & Furnham, A. (2003). Personality predicts academic performance:

Evidence from two longitudinal samples. Journal of Research in Personality, 37, 319−338.

Retrieved from http://www.sciencedirect.com/science/article/pii/S0092656602005780

Chang, K. E., Wu, L. J., Weng, S. E., & Sung, Y. T. (2012). Embedding game-based problem-

solving phase into problem-posing system for mathematics learning. Computers and

Education, 58(2), 775–786. https://doi.org/10.1016/j.compedu.2011.10.002

Chin, C., & Brown, D. E. (2002). Student-generated questions: A meaningful aspect of learning

in science. International Journal of Science Education, 24(5), 521–549.

https://doi.org/10.1080/09500690110095249

Clark, M. H., & Schroth, C. A. (2010). Examining relationships between academic motivation

and personality among college students. Learning and Individual Differences, 20(1), 19–

24. https://doi.org/10.1016/j.lindif.2009.10.002

Cockton, G., Woolrych, A., Lavery, D., Sears, A., & Jacko, J. (2009). Inspection-based

evaluations. Retrieved from https://books.google.com/books?hl=en&lr=&id=clMsHX-

JfyMC&oi=fnd&pg=PA273&dq=Inspection-

based+evaluation.&ots=7s7M8ncDGq&sig=ZxRK57wBUO3oSVwjwxgngwmjwxw

Cohen, A., & Baruth, O. (2017). Personality, Learning, and Satisfaction in Fully Online

Academic Courses. Computers in Human Behavior, 72, 1–12.

https://doi.org/10.1016/j.chb.2017.02.030

Damico, J., & Baildon, M. (2007). Examining ways readers engage with websites during think-

aloud sessions. Journal of Adolescent & Adult Literacy, 51(3), 254–263.

https://doi.org/doi:10.1598/JAAL.51.3.5

de Freitas, A. A., & de Freitas, M. M. (2013). Classroom Live: a software-assisted gamification

tool. Computer Science Education, 23(2), 186–206.

https://doi.org/10.1080/08993408.2013.780449

de Sousa Borges, S., Durelli, V. H. S., Macedo Reis, H., & Isotani, S. (2014). A systematic

mapping on gamification applied to education. Proceedings of the 29th Annual ACM

Symposium on Applied Computing - SAC ’14, (Icmc), 216–222.

https://doi.org/10.1145/2554850.2554956

Deci, E.L., Koestner, R., & Ryan, R. M. (1999). A meta-analytic review of experiments

examining the effects of extrinsic rewards on intrisic motivation. Psychological Bulletin,

125(6), 627–668. https://doi.org/10.1007/s13398-014-0173-7.2


Deci, E. L. (1971). Effects of externally mediated rewards on intrinsic motivation. Journal of

Personality and Social Psychology, 18(1), 105–115.

Deci, E. L., Betley, G., Kahle, J., Abrams, L., & Porac, J. (1981). When trying to win:

Competition and intrinsic motivation. Personality and Social Psychology Bulletin, 7(1),

79–83. https://doi.org/10.1177/014616728171012

Deci, E. L., & Ryan, R. (1990). A motivational approach to self: integration in personality.

Nebraska Symposium On Motivation. https://doi.org/10.1207/s15326985ep2603&4_6

Deci, E. L., & Ryan, R. M. (1985). Cognitive Evaluation Theory. In Intrinsic Motivation and

Self-Determination in Human Behavior (pp. 43–85). Boston, MA: Springer US.

https://doi.org/10.1007/978-1-4899-2271-7_3

Deci, E. L., & Ryan, R. M. (2008). Self-determination theory: A macrotheory of human

motivation, development, and health. In Canadian Psychology (Vol. 49, pp. 182–185).

https://doi.org/10.1037/a0012801

Deci, E. L., Vallerand, R. J., Pelletier, L. G., & Ryan, R. M. (1991). Motivation and education: The self-determination perspective. Educational Psychologist, 26(3 & 4), 325–346. https://doi.org/10.1080/00461520.1991.9653137

Denny, P. (2013). The effect of virtual achievements on student engagement. In Proceedings of

the SIGCHI Conference on Human Factors in Computing Systems - CHI ’13 (p. 763).

https://doi.org/10.1145/2470654.2470763

Denny, P., Luxton-Reilly, A., & Hamer, J. (2008). Student use of the PeerWise system. ACM

SIGCSE Bulletin, 40(3), 73. https://doi.org/10.1145/1384271.1384293

Deslauriers, L., Schelew, E., & Wieman, C. (2011). Improved learning in a large-enrollment

physics class. Science (New York, N.Y.), 332(6031), 862–4.

https://doi.org/10.1126/science.1201783

Dicheva, D., Dichev, C., Agre, G., & Angelova, G. (2015). Gamification in education: A

systematic mapping study. Educational Technology and Society, 18(3), 75–88.

https://doi.org/10.1109/EDUCON.2014.6826129.

Ding, D., Guan, C., & Yu, Y. (2017). Game-Based Learning in Tertiary Education: A New

Learning Experience for the Generation Z. International Journal of Information and

Education Technology, 7(2), 148–152. https://doi.org/10.18178/ijiet.2017.7.2.857

Domínguez, A., Saenz-de-Navarrete, J., de-Marcos, L., Fernández-Sanz, L., Pagés, C., & Martínez-Herráiz, J. J. (2013). Gamifying learning experiences: Practical implications and outcomes. Computers & Education, 63, 380–392. Retrieved from https://portal.uah.es/portal/page/portal/epd2_profesores/prof23288/publicaciones/GamifLearningExperiences_pre-review_v3.1PreprintFinal.pdf

Dong, T., Dontcheva, M., Joseph, D., Karahalios, K., Newman, M., & Ackerman, M. (2012).

Discovery-based games for learning software. In Proceedings of the 2012 ACM annual

conference on Human Factors in Computing Systems - CHI ’12 (p. 2083).


https://doi.org/10.1145/2207676.2208358

Drake, J., & Barlow, A. (2008). Assessing Students’ Levels of Understanding Multiplication

through Problem Writing. Teaching Children Mathematics. Retrieved from

http://eric.ed.gov/?id=EJ781359

Drucker, S. M., Glatzer, A., De Mar, S., & Wong, C. (2002). SmartSkip: consumer level

browsing and skipping of digital video content. In Proceedings of the SIGCHI conference

on Human factors in computing systems (pp. 219–226). ACM.

Dumas, J. S., & Redish, J. (1999). A practical guide to usability testing. Intellect books.

Dumas, S. T., Cutrell, E., & Chen, H. (n.d.). Bringing Order to the Web: Optimizing Search by

Showing Results in Context. In Proceedings of the CHI’2001 Conference on Human

Factors in Computing Systems (pp. 277–283).

Eccles, J. S., Wigfield, A., & Schiefele, U. (1998). Motivation to succeed.

Flatla, D. R., Gutwin, C., Nacke, L. E., Bateman, S., & Mandryk, R. L. (2011). Calibration

games: making calibration tasks enjoyable by adding motivating game elements.

Proceedings of the 24th Annual ACM Symposium on User Interface Software and

Technology - UIST ’11, 403–412. https://doi.org/10.1145/2047196.2047248

Foos, P. W. (1989). Effects of Student-Written Questions on Student Test Performance.

Teaching of Psychology, 16(2), 77–78. https://doi.org/10.1207/s15328023top1602_10

Fortier, M. S., Vallerand, R. J., & Guay, F. (1995). Academic motivation and school

performance: Toward a structural model. Contemporary Educational Psychology.

https://doi.org/10.1006/ceps.1995.1017

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., &

Wenderoth, M. P. (2014). Active learning increases student performance in science,

engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23),

8410–8415. https://doi.org/10.1073/pnas.1319030111

Fried, C. B. (2008). In-class laptop use and its effects on student learning. Computers &

Education, 50, 906–914. https://doi.org/10.1016/j.compedu.2006.09.006

Furnham, A. (1996). The FIRO-B, the Learning Style Questionnaire, and the five-factor model. Journal of Social Behavior and Personality, 11(2), 285–299. Retrieved from

http://search.proquest.com.ezaccess.libraries.psu.edu/docview/1292266187/fulltextPDF/BE

E4A6B7403E44AAPQ/1?accountid=13158

Gandell, T., Weston, C., Finkelstein, A., & Winer, L. (2000). Appropriate use of the web in

teaching higher education. In Perspectives in web course management (pp. 61–68).

Retrieved from

https://scholar.google.com/scholar?hl=en&as_sdt=0,39&q=appropriate+use+of+the+web+i

n+teaching+in+higher+education+in+B+Mann

Gaudreau, P., Miranda, D., & Gareau, A. (2014). Computers & Education Canadian university


students in wireless classrooms : What do they do on their laptops and does it really

matter ? q. Computers & Education, 70, 245–255.

https://doi.org/10.1016/j.compedu.2013.08.019

Glover, I. (2013). Play as you learn : gamification as a technique for motivating learners.

Proceedings of World Conference on Educational Multimedia, Hypermedia and

Telemcommunications, 1998–2008. Retrieved from http://shura.shu.ac.uk/7172/

Goehle, G. (2013). Gamification and Web-based Homework. Primus, 23(3), 234–246.

https://doi.org/10.1080/10511970.2012.736451

Goldberg, L. R. (1992). The development of markers for the Big-Five factor structure. Psychological Assessment, 4(1), 26–42. Retrieved from http://psycnet.apa.org/journals/pas/4/1/26/

Golightly, D., Hone, K. S., & Ritter, F. E. (1999). Speech Interaction Can Support Problem

Solving. In INTERACT (pp. 149–155). Citeseer.

Grail Research. (2011). Consumers of Tomorrow Insights and Observations About Generation

A. Retrieved from

http://www.integreon.com/pdf/Blog/Consumers_of_Tomorrow_Insights_and_Observations

_About_Generation_Z_246.pdf

Gray, M., & Wardle, H. (2013). Observing gambling behaviour using think aloud and video

technology: a methodological review. NatCen Social Research. Available at: Www.

Natcen. Ac. Uk.

Grolnick, W. S., & Ryan, R. M. (1987). Autonomy in children’s learning: An experimental and

individual difference investigation. Journal of Personality and Social Psychology, 52(5),

890.

Guan, Z., Lee, S., Cuddihy, E., & Ramey, J. (2006). The validity of the stimulated retrospective think-aloud method as measured by eye tracking. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1253–1262. Retrieved from

http://cmapspublic.ihmc.us/rid=1H8MBVM94-HBC556-3VVM/Ramey_RTA.pdf

Guay, F., Ratelle, C. F., & Chanal, J. (2008). Optimal learning in optimal contexts: The role of

self-determination in education. Canadian Psychology/Psychologie Canadienne, 49(3),

233.

Gulliksen, J., Göransson, B., Boivie, I., Blomkvist, S., Persson, J., & Cajander, Å . (2003). Key

principles for user-centred systems design. Behaviour and Information Technology, 22(6),

397–409.

Gutwin, C. (2002). Improving focus targeting in interactive fisheye views. In Proceedings of the

SIGCHI conference on Human factors in computing systems (pp. 267–274). ACM.

Haaranen, L., Ihantola, P., Hakulinen, L., & Korhonen, A. (2014). How (not) to introduce

badges to online exercises. Proceedings of the 45th ACM Technical Symposium on

Computer Science Education - SIGCSE ’14, 33–38.

https://doi.org/10.1145/2538862.2538921


Hamari, J. (2017). Do badges increase user activity? A field experiment on the effects of

gamification. Computers in Human Behavior, 71, 469–478.

https://doi.org/10.1016/j.chb.2015.03.036

Hamari, J., Koivisto, J., & Sarsa, H. (2014). Does gamification work?--a literature review of

empirical studies on gamification. In System Sciences (HICSS), 2014 47th Hawaii

International Conference on (pp. 3025–3034). IEEE.

https://doi.org/10.1109/HICSS.2014.377

Hanus, M. D., & Fox, J. (2015). Assessing the effects of gamification in the classroom: A longitudinal study on intrinsic motivation, social comparison, satisfaction, effort, and academic performance. Computers & Education, 80, 152–161.

https://doi.org/10.1016/j.compedu.2014.08.019

Haron, H., & Suriyani Sahar. (2010). An investigation on predictors of E-learning adoption

among Malaysian E-learners. CSSR 2010 - 2010 International Conference on Science and

Social Research, (Cssr), 927–932. https://doi.org/10.1109/CSSR.2010.5773921

Hembrooke, H., & Gay, G. (2003). The laptop and the lecture: The effects of multitasking in

learning environments. Journal of Computing in Higher Education, 15(1), 46–64.

https://doi.org/10.1007/BF02940852

Herath, T. C. (2015). Student Learning and Performance in Information Systems Courses: The

Role of Academic Motivation. Decision Sciences Journal of Innovative Education, 13(4),

583–601.

Hogan, R., Hogan, J., & Roberts, B. W. (1996). Personality measurement and employment

decisions. American Psychologist, 51(5), 469–477. https://doi.org/10.1037/0003-

066X.51.5.469

Hooker, M. (1997). The Transformation of Higher Education in: D. Oblinger and S. Rush (Eds).

The Learning Revolution: The Challenge of Information Technology in the. Retrieved

from

https://scholar.google.com/scholar?q=The+transformation+of+higher+education+in+D+Ob

linger+and+S+Rush&btnG=&hl=en&as_sdt=0%2C39

Igel, C., & Urquhart, V. (2012). Generation Z, meet cooperative learning: Properly implemented

cooperative learning strategies can increase student engagement and achievement. Middle

School Journal, 43(4), 16–21. Retrieved from

http://www.tandfonline.com/doi/pdf/10.1080/00940771.2012.11461816

Iosup, A., & Epema, D. (2014). An experience report on using gamification in technical higher

education. In Proceedings of the 45th ACM technical symposium on Computer science

education - SIGCSE ’14 (pp. 27–32). https://doi.org/10.1145/2538862.2538899

ISO. (2009). Ergonomics of human system interaction – Part 210: Human-centred design for

interactive systems (ISO 9241-210:2010; formerly known as 13407). International

Organization for Standardization.

Jeffrey, R., & Chisnell, D. (1994). Handbook of usability testing: how to plan, design, and


conduct effective tests. New York John Wiley Sons.

Jia, Y., Xu, B., Karanam, Y., & Voida, S. (2016). Personality-targeted Gamification.

Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems - CHI

’16, 2001–2013. https://doi.org/10.1145/2858036.2858515

Johnson, D. W., Johnson, R. T., & Smith, K. A. (1998). Active learning: Cooperation

in the college classroom. Interaction Book Co. Retrieved from

https://eric.ed.gov/?id=ED449714

Kapp, K. (2014). Gamification: Separating Fact From Fiction. Chief Learning Officer,

13(3)(March), 42–46. https://doi.org/10.2304/elea.2005.2.1.5

Kay, R. H., & Lauricella, T. S. (2011). Exploring the Benefits and Challenges of Using Laptop

Computers in Higher Education Classrooms : A Formative Analysis Exploration des

avantages et des défis relatifs à l ’ utilisation des ordinateurs portables dans les salles de co.

Canadian Journal of Learning and Technology/La Revue Canadienne de l’apprentissage et

de La Technologie, 37(1), 1–18. Retrieved from

https://www.cjlt.ca/index.php/cjlt/article/viewFile/26363/19545

Kay, R., & Lauricella, S. (2016). Assessing laptop use in higher education: The Laptop Use

Scale. Journal of Computing in Higher Education, 28(1), 18–44.

https://doi.org/10.1007/s12528-015-9106-5

Kieras, D. (2009). Model-based evaluation. The Human-Computer Interaction: Development

Process, 294–310.

Kim, E., Rothrock, L., & Freivalds, A. (2016). The effects of Gamification on engineering lab

activities. In Proceedings - Frontiers in Education Conference, FIE (Vol. 2016–Novem).

https://doi.org/10.1109/FIE.2016.7757442

Kim, E., Rothrock, L., & Freivalds, A. (2018). An Empirical Study on the Impact of Lab

Gamification on Engineering Students’ Satisfaction and Learning. International Journal of

Engineering Education, 34(1), 201–216.

Koestner, R., Ryan, R. M., Bernieri, F., & Holt, K. (1984). Setting limits on children’s behavior:

The differential effects of controlling vs. informational styles on intrinsic motivation and

creativity. Journal of Personality, 52(3), 233–248. https://doi.org/10.1111/j.1467-

6494.1984.tb00879.x

Komarraju, M., Karau, S. J., & Schmeck, R. R. (2009). Role of the Big Five personality traits in

predicting college students’ academic motivation and achievement. Learning and

Individual Differences, 19(1), 47–52. https://doi.org/10.1016/j.lindif.2008.07.001

Kraushaar, J. M., & Novak, D. C. (2010). Examining the affects of student multitasking with

laptops during the lecture. Journal of Information Systems Education, 21(2), 241–252.

Landers, R. ., Callan, R. ., Freitas, S. ., & Liarokapis, F. (2011). Serious games and edutainment

applications. In Serious Games and Edutainment Applications (pp. 9–23). Springer.

https://doi.org/10.1007/978-1-4471-2161-9


Lee, J. J., & Hammer, J. (2011). Gamification in education: What, how, why bother?

Academic Exchange Quarterly, 15(2), 1–5. https://doi.org/10.1081/E-ELIS3-120043942

Lepper, M. R., & Greene, D. (1975). Turning play into work: Effects of adult surveillance and

extrinsic rewards on children’s intrinsic motivation. Journal of Personality and Social

Psychology, 31(3), 479–486. https://doi.org/10.1037/h0076484

Lepper, M. R., Greene, R., & Nisbet, R. (1973). Undermining children’s intrinsic interest with

extrinsic rewards: a test of the “over justification” hypothesis. Journal of Personality and

Social Psychology, 28(1), 129–137. Retrieved from

http://psycnet.apa.org/journals/psp/28/1/129/

Lewis, C., & Mack, R. (1982). Learning to use a text processing system: Evidence from

“thinking aloud” protocols. In Proceedings of the 1982 conference on Human factors in

computing systems (pp. 387–392). ACM.

Li, W., Grossman, T., & Fitzmaurice, G. (2012). GamiCAD: a gamified tutorial system for first

time autocad users. In Proceedings of the 25th annual ACM symposium on User interface

software and technology (pp. 103–112). ACM.

Lin, Y. G., McKeachie, W. J., & Kim, Y. C. (2001). College student intrinsic and/or extrinsic

motivation and learning. Learning and Individual Differences.

https://doi.org/10.1016/S1041-6080(02)00092-4

Lindroth, T., & Bergquist, M. (2010). Laptopers in an educational practice: Promoting the

personal learning situation. Computers and Education, 54(2), 311–320.

https://doi.org/10.1016/j.compedu.2009.07.014

Lounis, S., Pramatari, K., & Theotokis, A. (2014). Gamification is all about fun: The role of

incentive type and community collaboration. ECIS 2014 Proceedings, 1–14. Retrieved

from http://aisel.aisnet.org/ecis2014/proceedings/track12/13

MacKinnon, D. W. (1944). The structure of personality. Ronald Press. Retrieved from

http://psycnet.apa.org/index.cfm?fa=search.displayrecord&uid=1944-19900-021

Malamed, C. (2012). The Gamification of Learning and Instruction: Game-Based Methods and

Strategies For Training And Education. eLearn Magazine (Vol. 2012). John Wiley &

Sons. https://doi.org/10.1145/2207270.2211316

Markland, D., & Tobin, V. (2004). A Modification to the Behavioural Regulation in Exercise

Questionnaire to Include an Assessment of Amotivation. Journal of Sport and Exercise

Psychology, 26(2), 191–196. https://doi.org/10.1123/jsep.26.2.191

Marshall, D., Foster, J. C., & Jack, M. A. (2001). User performance and attitude towards

schemes for alphanumeric data entry using restricted input devices. Behaviour &

Information Technology, 20(3), 167–188.

Maurer, T., Allen, D., Gatch, D. B., Shankar, P., & Sturges, D. (2012). Students’ academic

motivations in allied health classes. Internet Journal of Allied Health Sciences and

Practice, 10(1), 1–12.

Mossholder, K. W. (1980). Effects of externally mediated goal setting on intrinsic motivation: A

laboratory experiment. Journal of Applied Psychology, 65(2), 202–210.

https://doi.org/10.1037/0021-9010.65.2.202

Nakayama, M., Mutsuura, K., & Yamamoto, H. (2014). Impact of learner’s characteristics and

learning behaviour on learning performance during a fully online course. Electronic

Journal of E-Learning, 12(4), 394–408. Retrieved from

http://files.eric.ed.gov/fulltext/EJ1035656.pdf

National Academy of Engineering. (2005). Educating the Engineer of 2020: Adapting

engineering education to the new century. National Academies Press.

Nicholson, S. (2012). A User-Centered Theoretical Framework for Meaningful Gamification.

Games+ Learning+ Society, 1–7. https://doi.org/10.1007/978-3-319-10208-5_1

Nicholson, S. (2015). A recipe for meaningful gamification. In Gamification in Education and

Business (pp. 1–20). https://doi.org/10.1007/978-3-319-10208-5_1

Nicol, D. (2007). E‐assessment by design: using multiple‐choice tests to good effect. Journal of

Further and Higher Education, 31(1), 53–64. https://doi.org/10.1080/03098770601167922

Nielsen, J. (1994a). Usability Engineering. Elsevier. https://doi.org/10.1145/1508044.1508050

Nielsen, J. (1994b). Usability inspection methods. Conference Companion on Human Factors in

Computing Systems - CHI ’94, 25(1), 413–414. https://doi.org/10.1145/259963.260531

Nielsen, J., & Landauer, T. (1993). A mathematical model of the finding of usability problems.

Proceedings of the INTERACT’93 and CHI’93. Retrieved from

http://dl.acm.org/citation.cfm?id=169166

Nielsen, J., & Norman, D. (2014). The definition of user experience. Nielsen Norman Group.

Nilsson, K. E. L., & Warrén Stomberg, M. I. (2008). Nursing students motivation toward their

studies – a survey study. BMC Nursing, 7(1), 6. https://doi.org/10.1186/1472-6955-7-6

Noftle, E. E., & Robins, R. W. (2007). Personality predictors of academic outcomes: Big five

correlates of GPA and SAT scores. Journal of Personality and Social Psychology, 93(1),

116–130. https://doi.org/10.1037/0022-3514.93.1.116

Norman, D. (1988). The design of everyday things (originally published: The psychology of

everyday things). Basic Books.

Norman, D. (2002). Emotion & design: attractive things work better. Interactions, 9(4), 36–42.

Norman, D. A., & Draper, S. W. (1986). User centered system design: New perspectives on

human-computer interaction. CRC Press.

O’Brien, T. B., & DeLongis, A. (1996). The interactional context of problem-, emotion-, and

relationship-focused coping: the role of the big five personality factors. Journal of

Personality, 64(4), 775–813. https://doi.org/10.1111/j.1467-6494.1996.tb00944.x

O’Donovan, S., Gain, J., & Marais, P. (2013). A Case Study in the

Gamification of a University-level Games Development Course. In ACM International

Conference Proceeding Series (pp. 242–251). https://doi.org/10.1145/2513456.2513469

Olmsted-Hawala, E., Murphy, E., Hawala, S., & Ashenfelter, K. T. (2010). Think-aloud

protocols: a comparison of three think-aloud protocols for use in testing data-dissemination

web sites for usability. Proceedings of the 28th International Conference on Human

Factors in Computing Systems, 2381–2390. https://doi.org/10.1145/1753326.1753685

Önder, İ., Beşoluk, Ş., İskender, M., Masal, E., & Demirhan, E. (2014). Circadian Preferences,

Sleep Quality and Sleep Patterns, Personality, Academic Motivation and Academic

Achievement of university students. Learning and Individual Differences, 32, 184–192.

https://doi.org/10.1016/j.lindif.2014.02.003

Pancake, C. M. (1998). Improving the quality of numerical software through user-centered

design. Lawrence Livermore National Lab., CA (United States).

Park, J., Chung, S., An, H., Park, S., Lee, C., Kim, S. Y., … Kim, K.-S. (2012). A structural

model of stress, motivation, and academic performance in medical students. Psychiatry

Investig, 9(2), 143–149. https://doi.org/10.4306/pi.2012.9.2.143

Petrides, K. V, Chamorro-Premuzic, T., Frederickson, N., & Furnham, A. (2005). Explaining

individual differences in scholastic behaviour and achievement. The British Journal of

Educational Psychology, 75(Pt 2), 239–255. https://doi.org/10.1348/000709904X24735

Pintrich, P. R., & Schunk, D. H. (2002). Motivation in education: Theory, research, and

applications (2nd ed.). Columbus, OH: Merrill Prentice Hall.

Plaisant, C., & Shneiderman, B. (2005). Show me! Guidelines for producing recorded

demonstrations. In Visual Languages and Human-Centric Computing, 2005 IEEE

Symposium on (pp. 171–178). IEEE.

Preece, J., Rogers, Y., & Sharp, H. (2002). Interaction design: Beyond human-computer interaction. John Wiley & Sons.

Prince, M. (2004). Does Active Learning Work? A Review of the Research. Journal of

Engineering Education, 93(3), 223. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x

Ramey, J., Boren, T., Cuddihy, E., Dumas, J., Guan, Z., Van den Haak, M. J., & De Jong, M. D.

T. (2006). Does think aloud work?: how do we know? In CHI’06 Extended Abstracts on

Human Factors in Computing Systems (pp. 45–48). ACM.

Reiners, L. C. W., Hebbel-Seeger, A., Reiners, T., & Schäffer, D. (2014). Game-based elements

to upgrade bots to non-player characters in support of educators. In Synthetic Worlds (pp.

273–294). Springer. https://doi.org/10.1007/978-1-4614-6286-6

Reza, M., & Khan, F. (2014). Exploring the Influence of Big Five Personality Traits towards

Computer Based Learning (CBL) Adoption. Journal of Information Systems Research

and Innovation, 1–8. Retrieved from http://seminar.utmspace.edu.my/jisri/

Ryan, R. M. (1982). Control and information in the intrapersonal sphere: An extension of

cognitive evaluation theory. Journal of Personality and Social Psychology, 43(3), 450–

461. https://doi.org/10.1037//0022-3514.43.3.450

Ryan, R. M., & Deci, E. L. (2000a). Intrinsic and extrinsic motivation: Classic definitions and

new directions. Contemporary Educational Psychology, 25, 54–67.

https://doi.org/10.1006/ceps.1999.1020

Ryan, R. M., & Deci, E. L. (2000b). Self-determination theory and the facilitation of intrinsic

motivation. American Psychologist, 55, 68–78. https://doi.org/10.1037/0003-066X.55.1.68

Seaborn, K., & Fels, D. I. (2015). Gamification in theory and action: A survey. International

Journal of Human Computer Studies, 74, 14–31.

https://doi.org/10.1016/j.ijhcs.2014.09.006

Sheldon, K. M., & Filak, V. (2008). Manipulating autonomy, competence, and relatedness

support in a game-learning context: New evidence that all three needs matter. British

Journal of Social Psychology, 47(2), 267–283. https://doi.org/10.1348/014466607X238797

Silva, L., Braga, J. C., Ghilardi-Lopes, N. P., Pinhata, E., Simões, E., Ribeiro, T., … Shinohara,

B. (2013). Educational game on global environmental changes: Collaborative design using

a social network. Proceedings of SBGames. São Paulo: Sociedade Brasileira de

Computação, 520–523.

Skolnik, R., & Puzo, M. (2008). Utilization of laptop computers in the school of business

classroom. Academy of Educational Leadership, 12(2), 1–10. Retrieved from

http://search.proquest.com/openview/3b8963133712a05c9a6910f86fc3cee8/1?pq-

origsite=gscholar&cbl=38741

Stavljanin, V., Milenkovic, I., & Sosevic, U. (2016). Educational Website Conversion

Improvement Using Gamification. International Journal of Engineering

Education, 32(1), 563–573. Retrieved from

https://scholar.google.com/scholar?hl=en&q=Educational+Website+Conversion+Improve

ment+Using+Gamification&btnG=&as_sdt=1%2C39&as_sdtp=

Sturges, D., Maurer, T. W., Allen, D., Gatch, D. B., & Shankar, P. (2016). Academic

performance in human anatomy and physiology classes: a 2-yr study of academic

motivation and grade expectation. Advances in Physiology Education, 40(1), 26–31.

https://doi.org/10.1152/advan.00091.2015

Turner, E. A., Chandler, M., & Heffer, R. W. (2009). The Influence of Parenting Styles,

Achievement Motivation, and Self-Efficacy on Academic Performance in College

Students. Journal of College Student Development, 50(3), 337–346.

https://doi.org/10.1353/csd.0.0073

Umetsu, T., Hirashima, T., & Takeuchi, A. (2002). Fusion method for designing computer-based

learning game. In International Conference on Computers in Education, 2002.

Proceedings. (Vol. 1, pp. 124–128). IEEE. https://doi.org/10.1109/CIE.2002.1185882

Vansteenkiste, M., Smeets, S., Soenens, B., Lens, W., Matos, L., & Deci, E. L. (2010).

Autonomous and controlled regulation of performance-approach goals: Their relations to

perfectionism and educational outcomes. Motivation and Emotion, 34(4), 333–353.

Vansteenkiste, M., Zhou, M., Lens, W., & Soenens, B. (2005). Experiences of autonomy and

control among Chinese learners: Vitalizing or immobilizing? Journal of Educational

Psychology, 97(3), 468.

Vedel, A. (2014). The Big Five and tertiary academic performance: A systematic review and

meta-analysis. Personality and Individual Differences, 71, 66–76.

https://doi.org/10.1016/j.paid.2014.07.011

Wagerman, S. A., & Funder, D. C. (2007). Acquaintance reports of personality and academic

achievement: A case for conscientiousness. Journal of Research in Personality, 41(1),

221–229. https://doi.org/10.1016/j.jrp.2006.03.001

Wood, L. C., & Reiners, T. (2012). Gamification in Logistics and Supply Chain Education:

Extending Active Learning. IADIS Internet Technologies and Society,

2012(November), 101–108.

Yohannis, A. R., Prabowo, Y. D., & Waworuntu, A. (2014). Defining Gamification: From

lexical meaning and process viewpoint towards a gameful reality. In Information

Technology Systems and Innovation (ICITSI), 2014 International Conference on (pp. 284–

289). IEEE.

Yu, F., & Liu, Y. (2008). The comparative effects of student question-posing and question-

answering strategies on promoting college students’ academic achievement, cognitive and.

Journal of Education and Psychology. Retrieved from

http://conf.ncku.edu.tw/research/articles/e/20090807/2.pdf

Yu, F., Liu, Y., & Chan, T. (2005). A web‐based learning system for question‐posing and peer

assessment. Innovations in Education and Teaching International, 42(4), 337–348.

https://doi.org/10.1080/14703290500062557

Yuretich, R. F., Khan, S. A., Leckie, R. M., & Clement, J. J. (2001). Active-learning methods to

improve student performance and scientific interest in a large introductory oceanography

course. Journal of Geoscience Education, 49(2), 111–119. Retrieved from

http://www.scopus.com/inward/record.url?eid=2-s2.0-

0035004791&partnerID=40&md5=ed2302f7f1801dda1a1b9dce7fe148f4

Zhang, L. F. (2003). Does the big five predict learning approaches? Personality and Individual

Differences, 34(8), 1431–1446. https://doi.org/10.1016/S0191-8869(02)00125-3

Zusho, A., Pintrich, P., & Coppola, B. (2003). Skill and will: The role of motivation and

cognition in the learning of college chemistry. International Journal of Science Education,

25(9), 1081–1094. https://doi.org/10.1080/0950069032000052207

VITA

Eunsik Kim

EDUCATION

Ph.D. Candidate, Dept. of Industrial & Manufacturing Engineering

Graduated: August 2018

Penn State University, University Park, PA

Advisors: Andris Freivalds and Ling Rothrock

GPA: 3.90/4.0

M.S. Dept. of Industrial & Systems Engineering

Graduated: February 2013

University at Buffalo, Buffalo, NY

GPA: 3.88/4.0

M.E. Dept. of Industrial & Management Systems Engineering

Graduated: August 2010

Dong-A University, Busan, KOREA

Advisor: Hoonyoung Yoon

GPA: 4.38/4.5

B.S. Dept. of Industrial & Management Systems Engineering

Graduated: February 2008

Dong-A University, Busan, KOREA

GPA: 4.19/4.5

PUBLICATIONS

Kim, E., Rothrock, L., Freivalds, A. (2018), “An Empirical Study on the Impact of Lab Gamification

on Engineering Students’ Satisfaction and Learning.” International Journal of Engineering

Education, Vol. 34, No. 1, pp. 201-216.

Yoon, H., Kim, E. (2012), “Upper Limbs Related Muscle Strength and Fatigue During the

Wrench Job for Korean Young Aged.” Journal of the Society of Korea Industrial and Systems

Engineering, Vol. 35, No. 2, pp. 88-97.

Kim, E., Yoon, H. (2011), “Ergonomic Evaluation of Workload in Imbalanced Lower Limbs

Postures.” The Ergonomics Society of Korea, Vol. 30, No. 5, pp. 671-681.

Yoon, H., Kim, E. (2009), “Muscle Strength Measurement using Shoulder and Upper Joint for

Korean Young aged.” Journal of the Ergonomics Society of Korea, Vol. 28, No. 3, pp. 125-134.