Phillips Johnson: Online Homework versus Intelligent Tutoring Systems: Pedagogical Support for Transaction Analysis and Recording




Online Homework versus Intelligent Tutoring Systems: Pedagogical Support for Transaction Analysis and Recording

    Fred Phillips*

Edwards School of Business

University of Saskatchewan

25 Campus Drive

Saskatoon, SK, Canada S7N 5A7

[email protected]

    Benny G. Johnson

Quantum Simulations, Inc.

5275 Sardis Road

Murrysville, PA

[email protected]

    November 13, 2009

    Key Words: Artificial intelligence, tutoring, online homework, accounting cycle

    * Corresponding author. We thank Linda Chase, Sara Harris, and David Johnson for assistance in

    developing the transaction analysis tutor described in this paper, and Sandy Hilton, Christine

Kloezeman, Barbara Phillips, Regan Schmidt, Ganesh Vaidyanathan, two anonymous reviewers, an anonymous associate editor, and participants at the 2009 American Accounting Association

    and Canadian Academic Accounting Association annual conferences for comments on a prior

    version of the paper. This research was facilitated by financial support from the George C. Baxter

    Scholarship at the University of Saskatchewan. Data are available upon request from the first

    author. In the interests of full disclosure, Fred Phillips reports an indirect financial interest in the

online homework system examined in this paper. Fred Phillips is a coauthor of textbooks supported by digital products that are based on the online homework system studied in this paper.

    Benny Johnson reports a direct financial interest in the intelligent tutoring system examined in

    this paper. Benny Johnson founded and operates the organization that developed and markets the

    intelligent tutoring system. These potential conflicts of interest work in competing directions;

    nonetheless, both authors assert that they have been objective in the research reported in this

    paper.


    Online Homework versus Intelligent Tutoring Systems:

    Pedagogical Support for Transaction Analysis and Recording

    Abstract

    Prior research demonstrates that students learn more from homework practice when using

online homework or intelligent tutoring systems than when using a paper-and-pencil format.

    However, no accounting education research directly compares the learning effects of

    online homework systems with the learning effects of intelligent tutoring systems. This

paper presents a quasi-experiment that compares the two systems and finds that students'

    transaction analysis performance increased at a significantly faster rate when they used an

    intelligent tutoring system rather than an online homework system. Implications for

    accounting instructors and researchers are discussed.

    Key Words: Artificial intelligence, tutoring, online homework, accounting cycle


    Online Homework versus Intelligent Tutoring Systems:

    Pedagogical Support for Transaction Analysis and Recording

    INTRODUCTION

    Example-based instruction and homework practice have been key pedagogies in

    accounting for decades. Indeed, Ijiri (1983, 170) remarked in the inaugural volume of

Issues in Accounting Education that "[t]he practicality of accountants can perhaps be attributed largely to this mode of teaching and learning that seems to have existed since the dawn of accounting education." Over the years, educators have attempted to enhance

    this mode of teaching and learning (Schmidt and Bjork 1992; Bonner and Walker 1994;

Wynder and Luckett 1999; Halabi et al. 2005; Lindquist and Olsen 2007), most recently

    by developing new technologies such as online homework systems (Gaffney et al. 2009)

    and intelligent tutoring systems (Johnson et al. 2009). These systems provide greater

    practice and more timely feedback than ever before. As these systems become

    increasingly available, instructors will be expected to choose which system to

    recommend or require for students. The purpose of this paper is to inform such a choice

    by assessing the relative impact of an online homework system (OHS) and an intelligent

tutoring system (ITS) on students' learning.

    On the surface, both an OHS and an ITS offer benefits that make them appear

    equally desirable. For example, both systems offer the opportunity to practice and gain

    instant feedback on a seemingly unlimited number of algorithmically generated

    problems. Both systems also provide structures (e.g., response fields, drop-down menus)

that may initially help students in breaking down analyses and organizing their responses.

    An ITS claims to offer the added benefit of allowing students to ask questions of the

    tutoring system or to request that it demonstrate the steps needed to solve a particular



    problem. While intended as a benefit, this feature could impede learning if students use it

    merely as a shortcut to a solution rather than as an enrichment of the learning process. As

    this discussion demonstrates, it is not possible to identify the most effective system by

    merely comparing the features of the two systems. Rather, empirical analysis is required.

    Prior accounting education research empirically compares paper-and-pencil

    homework with an OHS (Gaffney et al. 2009) and with an ITS (Johnson et al. 2009),

    finding that both systems enhance student performance. However, prior research does not

    directly compare the two systems with one another. In this study, we assess the relative

    effects of an OHS and ITS on student learning. Participants included 139 business

    students enrolled in four sections of an introductory financial accounting course. The first

    and third class sections completed a homework assignment using an OHS while the

    second and fourth class sections completed the same homework assignment using an ITS.

    All students were tested and then proceeded to complete a second homework assignment

    using the other system. Analyses of in-class tests held immediately before and after each

    homework assignment showed that, on average, students gained knowledge by

    completing homework practice. However, the gains were greater immediately after

    having used the ITS than after having used the OHS.

    This research makes an important contribution to both academic research and

    teaching. Until now, no research has directly compared an ITS with an OHS, despite their

    increasing prevalence in accounting education. We provide evidence that suggests

    students learn more when using an ITS than when using an OHS. We also discuss some

    of the system limitations that instructors may wish to contemplate before choosing

    between the two systems.



    The following sections review prior research, describe the research method used

    to assess the relative impact of an OHS and ITS, present empirical results, and conclude

    with general observations, limitations, and directions for future research.

    BACKGROUND AND PRIOR RESEARCH

    Online Homework Systems

    Most textbook publishers provide an online homework system (OHS) for

    introductory and intermediate accounting courses.1 An OHS offers students the

    opportunity to gain immediate feedback on their answers to a seemingly unlimited

    number of algorithmically generated homework problems. It also offers instructors

    benefits, such as generating consistent and timely outcome feedback on large numbers of

    homework problems and automatically entering student grades into a gradebook.

    Although an OHS can have offsetting drawbacks (e.g., limited tolerance for minor

    departures from problem solutions), research suggests that an OHS contributes to student

    learning and that the majority of students prefer an OHS to paper-and-pencil homework

    (Bonham et al. 2001; Dillard-Eggars et al. 2008). This enhanced learning and preference

    for an OHS is believed to arise from the opportunity for frequent practice and the

    immediacy of outcome feedback, both of which have been shown in basic research to

    enhance learning (Balzer et al. 1989; Kulik and Kulik 1988).

    In accounting, two prior studies assess the impact of an OHS on student

performance. Dillard-Eggars et al. (2008) studied whether accounting principles students'

    course grades were associated with homework scores obtained when using an OHS.

    Results indicated a strong, positive relationship, suggesting that the completion of online

    1 Although user interfaces and brand names vary from one publisher to the next (e.g., CengageNow,

    WileyPlus), online homework systems share the same basic features discussed in this paper. The OHS used

in this study was McGraw-Hill's Homework Manager.



    homework may have contributed to student learning. However, this research was not

    designed to determine whether the OHS generated incremental learning gains beyond

    other forms of homework. Thus, Gaffney et al. (2009) conducted a study in which they

    compared student performance across two sections of a financial accounting course; one

    section completed homework throughout the course using an OHS and the other section

    completed identical homework in a paper-and-pencil format. After statistically

    controlling for differences in student profiles between the two sections, analyses

    indicated that students in the OHS section outperformed those in the paper-and-pencil

    section on course components that resembled the homework.

    The results from the accounting studies are generally consistent with research in

    other disciplines. Studies in calculus, chemistry, and physics generally report that

    students score higher on tests when they have prepared for those tests using an OHS

    rather than a paper-and-pencil method (Arasasingham et al. 2005; Cheng et al. 2004;

    Dufresne et al. 2002; Hirsch and Weibel 2003; Zerr 2007).

    A few studies have found that an OHS does not outperform paper-and-pencil

    homework, but these studies have been plagued with confounding factors. In one study,

students completing paper-and-pencil homework scored better than those using an OHS, but the paper-and-pencil

    homework was accompanied by extensive instructor debriefing and feedback on both

    outcomes and problem-solving processes whereas the OHS provided only outcome

    feedback (Bonham et al. 2003). In another study, students who had been assigned to a

    paper-and-pencil condition circumvented the experimental treatment by accessing the

    OHS via friends in other sections of the course (Cole and Todd 2003). In general, studies

with strong research designs find that an OHS contributes to better exam performance.



    Despite offering benefits over paper-and-pencil homework, an OHS is not perfect.

Feedback is often limited to the accuracy of students' answers, remaining silent about the

    methods students use to reach their answers. Further, an OHS provides little help to

    students who do not know where to start because the system can assess only the

    outcome of the problem-solving process. To address these limitations, a system must be

capable of reading and comprehending each student's problem-solving analyses and offering

    direct instruction when requested by the student. The artificial intelligence used in an ITS

    provides a means of creating this additional support in an online platform. To our

    knowledge, only one such system currently exists in accounting. This system is discussed

    briefly below and is more fully described by Johnson et al. (2009).2

    Intelligent Tutoring Systems

    Prototypes of an ITS originated in chemistry, physics, and mathematics. Like an

    OHS, an ITS allows students to work on a seemingly unlimited number of homework

    problems that are either read into or algorithmically generated by the system. Unlike an

    OHS, however, an ITS provides feedback on not only the accuracy of an answer but also

    the process used to reach an answer (Johnson et al. 2009). Through this process

    orientation, an ITS provides explicit instruction on the process needed to reach a solution,

    thereby providing step-by-step assistance to students who have no idea where to start on a

    problem. Another advantage of an ITS is that it provides context-specific answers to

students' questions about concepts or applications of concepts. This context specificity

allows students to obtain help on the particular problem they are working on, rather than a

    generic problem that students may perceive as unrelated to their particular problem.

    2 A demo of the system, developed by Quantum Simulations, is available at quantumsimulations.com.



    The ITS examined in this study focuses on transaction analysis. The tutor includes

    four main components: (1) comprehend problem information (i.e., lists of transactions),

    (2) identify the accounting equation effects of each transaction, (3) prepare journal entries

    to record these effects, and (4) post these journal entries to general ledger accounts

    (represented by T-accounts).
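Components (2) and (3) can be illustrated with a small sketch: checking that a journal entry balances and deriving its net accounting-equation effects. This is a hypothetical Python example, not part of the actual ITS; the account names, `JournalEntry` structure, and category mapping are assumptions made for this sketch.

```python
from dataclasses import dataclass

# Normal balance side for a few sample accounts (an assumption of this sketch).
NORMAL_SIDE = {
    "Cash": "debit",
    "Equipment": "debit",
    "Notes Payable": "credit",
    "Contributed Capital": "credit",
}

# Accounting-equation category for each sample account.
CATEGORY = {
    "Cash": "Assets", "Equipment": "Assets",
    "Notes Payable": "Liabilities", "Contributed Capital": "Equity",
}

@dataclass
class JournalEntry:
    debits: dict   # account name -> amount
    credits: dict  # account name -> amount

    def is_balanced(self):
        # Component (3) check: total debits must equal total credits.
        return sum(self.debits.values()) == sum(self.credits.values())

def accounting_equation_effects(entry):
    """Component (2): net effect of an entry on Assets = Liabilities + Equity."""
    effect = {"Assets": 0, "Liabilities": 0, "Equity": 0}
    for account, amount in entry.debits.items():
        # Debiting an account increases it if its normal side is debit.
        sign = 1 if NORMAL_SIDE[account] == "debit" else -1
        effect[CATEGORY[account]] += sign * amount
    for account, amount in entry.credits.items():
        sign = 1 if NORMAL_SIDE[account] == "credit" else -1
        effect[CATEGORY[account]] += sign * amount
    return effect

# Transaction: purchase $5,000 of equipment by signing a note payable.
entry = JournalEntry(debits={"Equipment": 5000}, credits={"Notes Payable": 5000})
print(entry.is_balanced())                 # True
print(accounting_equation_effects(entry))  # Assets +5000, Liabilities +5000
```

An actual tutor must also comprehend free-form transaction descriptions (component 1) and post entries to T-accounts (component 4), which this sketch omits.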

    The transaction analysis tutor allows a range of interactions similar to what could

    occur with a human tutor. For example, if a student does not know how to start a

    problem, he can ask the tutor to analyze each step of the problem for him using

    conversational-style natural language. On the other hand, if a student wishes, he can

    analyze the transaction without seeking help or feedback from the tutor. If errors arise in

    his analyses, the tutor will provide corrective feedback when he proceeds from one

    subgoal (e.g., analyzing accounting equation effects) to the next (preparing a journal

    entry). The student can ask the transaction analysis tutor to check his work, explain how

    the tutor would think through a particular part of the problem, or provide instruction on

specific topics (e.g., "Why is contributed capital categorized as equity?"). An important

    feature of the tutor is that it dynamically generates its explanations and instructional

    points for each individual student, based on the specific part of the particular problem on

    which each student is working and the particular responses the student has previously

    given. After each explanation or instruction, the tutor allows the student to ask as many

follow-up questions as the student feels are necessary. This back-and-forth exchange

    focused on the problem-solving process is the primary reason for expecting that an ITS

    will be more effective than an OHS.



    An initial study examining the effectiveness of the transaction analysis tutor was

    reported by Johnson et al. (2009). In that study, students were initially given a pretest and

    then one class section worked on homework using the ITS and another class section

    worked on the same homework in a paper-and-pencil format. Performance on a

    subsequent test differed significantly between groups; those who used the ITS enjoyed an

    improvement of 27 percentage points whereas those who completed paper-and-pencil

    homework improved only 8 percentage points. While these differences in performance

    are consistent with the ITS providing significant learning benefits over paper-and-pencil

homework, they are subject to limitations. The ITS is an electronic medium whereas paper-and-pencil homework is paper-based. Most students, including those who participated in

the study, were accustomed to completing their homework using an electronic system, not

    paper-and-pencil. In addition, the research design (i.e., a nonequivalent control group)

    used in the prior study did not fully control for alternative explanations of the research

    results. For example, differential maturation between groups could account for the

    differences in performance (Cook and Campbell 1979). To address these limitations, we

    designed the follow-up study described in the following section.

    METHOD

    Participants and Procedure

    This study was conducted with undergraduate business students registered in four

sections of one instructor's sophomore-level Financial Accounting course. The students

    had an average age of 19, had an average GPA of 2.94, and were split approximately

    equally between sexes (56% female). The students completed all homework assignments

    and tests, including those analyzed in this study, as part of their regular course



    requirements. At the time they completed these course requirements, the students did not

    know that their homework and test responses might later be analyzed as part of a research

    study. After final course grades had been assigned, all students (n=192) were contacted

by email and asked to become research participants by giving their informed consent.

    Participation in this research did not require any additional work on the part of the

    students. After one follow-up request, 139 students (72%) consented to participate.

Participants did not differ from non-participants in regard to their GPA (p=0.299) or proportion of men and women (p=0.669), suggesting that participants were representative of the larger student group.3

    To assess the effects of the different homework systems on student learning, we

analyzed students' grades from three tests that were administered in class around the time

    students were completing homework assignments on the topic of transaction analysis and

    recording (Chapters 2 and 3 of the textbook). The first test, administered on the third day

of the course, assessed students' ability to analyze and prepare journal entries for 10

    transactions that affected only balance sheet accounts. This 15-minute paper-and-pencil

    test was given immediately after formal in-class instruction on these topics, prior to any

    homework or tutorial practice on these topics. Before the next (fourth) class meeting,

    students completed a related set of homework problems requiring analysis and recording

    of 19 transactions. Depending on the condition to which students had been assigned,

    these homework problems were presented using either an OHS or ITS. At the beginning

    of the fourth class meeting, students completed a second (20-minute) paper-and-pencil

test, comprising 14 transactions. Like the first test, this test assessed students' ability to

3 The approved research ethics protocol for this study prevents us from analyzing and reporting non-participants' performance data relating to specific elements of the course.



    analyze and prepare journal entries for transactions that affected only balance sheet

    accounts. The remainder of the class meeting was used to instruct students on analyzing

    and preparing journal entries for transactions that affect both balance sheet and income

    statement accounts. Before the fifth class meeting, students completed a related set of

    homework problems requiring analysis and recording of 28 transactions, using either the

    OHS or the ITS. At the beginning of the fifth class meeting, students completed a third

    (24-minute) paper-and-pencil test comprising 17 transactions that affected both balance

    sheet and income statement accounts.4

    To ensure consistency in instruction, the four sections were taught by the same

    instructor in sequential 80-minute classes.5 The required class materials in the four

    sections were identical, consisting of a textbook and a set of student class notes that were

synchronized with the content of the instructor's PowerPoint slides. As in Gaffney et al.

    (2009), the OHS allowed students to access homework assignments between specified

    dates, receive preliminary feedback on the accuracy of their homework answers, change

    their answers, and submit them for grading at any time prior to the deadlines.

    Research Design

    To familiarize students with the OHS that would be used later in the course, we

    required all students to use the OHS to complete an initial homework assignment on

    topics in Chapter 1 of the text (e.g., the financial statements and their main users). This

    homework assignment did not involve transaction analysis or journal entries, which are

    4 Presenting transactions as balance sheet-only effects or combined balance sheet and income statement

    effects was consistent with the way they were presented in the related textbook chapters. The test and

homework questions are available from the first author on request.

5 The instructor is an author of this paper. One potential concern with this relationship is that students might

    feel pressured to participate unwillingly in the research. To minimize this risk, our REB-approved research

    protocol required that students be invited to participate in the study only after final grades were assigned

    and approved by the department head.



    the focus of this study, but it did provide data that we used as a covariate in our analyses.

    Following the completion of Chapter 1 homework, the four sections of the course were

    assigned randomly to one of two conditions, in a removed treatment nonequivalent

    control group with pretest and posttest design (Cook and Campbell 1979). In this design,

    the ITS treatment is given to one group of students and withheld from the other group,

    which instead completes identical homework problems using the OHS. These conditions

    are then reversed for the subsequent homework assignment. Using notation similar to

    Cook and Campbell, but designating test scores as T and the experimental treatments as

    ITS or OHS, this design can be diagrammed as follows.

Group         Ch. 1 Homework   Class 3   Ch. 2 Homework (Treatment 1)   Class 4   Ch. 3 Homework (Treatment 2)   Class 5

ITS Earlier   OHS              T1        ITS                            T2        OHS                            T3

ITS Later     OHS              T1        OHS                            T2        ITS                            T3

    In anticipation that students characteristics might differ between sections, we

    alternated the assignment of sections to experimental conditions such that the first and

    third class sections were assigned to the ITS Earlier group and the second and fourth class

    sections were assigned to the ITS Later group.6 An alternative design would have been to

    randomly assign individual students rather than entire course sections to the experimental

    conditions, but this design was not chosen for two reasons. First, to ensure equity in final

grade assignment, our university's REB recommended that grades be standardized across

    class sections based on each sections midterm and final exam performance. By assigning

    class sections rather than individual students to experimental treatments, we were able to

    implement this recommended method of grade determination and adjust for differences in

    6 This design choice was intended to evenly distribute varying student characteristics across the two

    conditions. It also reduces the effects of information leakage (about test content) from early class sections

    to late sections, should any exist.



    homework scores that arose between class sections. Second, assigning experimental

    treatments to class sections rather than to individual students reduced the risk that students

    would question their use of different homework systems for the same homework

    assignments. We wanted students to perceive the homework as homogeneous within each

    class section; anecdotal observations suggested that we were successful in this because no

    students commented on the cross-sectional variation in homework completion method.

    RESULTS

    Analyses of Homework Practice

    For the OHS and ITS to be effective, students must actively engage with them

    when attempting the homework practice. Data limitations prevented us from comparing

    the time spent by students using the OHS and ITS, but we were able to ensure they had

    fully completed the homework assignments. Students in the ITS Earlier group completed

    100 percent of the problems in the Chapter 2 homework assignment using the ITS and

    earned an average grade of 95 percent when completing the Chapter 3 homework

    assignment using the OHS. Students in the ITS Later group earned an average grade of

    94 percent when completing the Chapter 2 homework assignment using the OHS and

    completed all the problems in the Chapter 3 homework assignment using the ITS.

    Analyses of Test Scores

    A multivariate analysis of variance (MANOVA) was conducted to determine

    whether the two experimental conditions differed in overall test performance. Results

indicate that student performance differed between the experimental conditions (Wilks' Λ = 0.949, F=2.415, p=0.035). Average scores on each of the three tests are reported by

    condition in Table 1, along with the average Chapter 1 homework score and GPA.



    Insert Table 1 about here

    Despite alternating the assignment of course sections to condition, we noted that

    the average Test 1 score of students in the ITS Later group (80.7 percent) was greater

    than that in the ITS Earlier group (72.9 percent) at a statistically significant level (t=2.35,

    p=0.020). A similar pattern exists in average Chapter 1 homework scores and GPAs, with

the ITS Later group exceeding that of the ITS Earlier group (t=3.28, p=0.001 and t=2.44,

    p=0.016, respectively). These differences are not caused by the experimental treatments

    because the treatments were administered later, after Chapter 1 homework and Test 1 had

    been completed. Instead, these differences suggest that the ITS Later group comprised

    higher performing students than the ITS Earlier group.

    To control for these differences, we calculated gain scores for each student by

subtracting from each student's subsequent test scores his or her immediately preceding

    test score. These gain scores control for differences in each individuals initial level of

    performance by examining incremental gains, similar to within-subject analyses of

    variance (Rosenthal and Rosnow 1985). Gain scores offer the added benefit of ease of

    interpretation. That is, gain scores directly measure performance improvements over

    time. For example, by subtracting Test 1 scores from Test 2, we obtain EarlyGain, which

    indicates the extent to which each students performance improved between Tests 1 and

    2. If students learn more when completing homework with the ITS than the OHS,

    EarlyGain will be greater in the ITS Earlier group than in the ITS Later group.7
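The gain-score computation and tests can be sketched in Python. The scores below are invented for illustration only (the study's actual data are available from the first author); the computation simply mirrors the EarlyGain definition above.

```python
import numpy as np
from scipy import stats

# Hypothetical Test 1 and Test 2 scores (in percent) for a few students per group.
its_earlier_t1 = np.array([70.0, 75.0, 68.0, 78.0])
its_earlier_t2 = np.array([84.0, 88.0, 79.0, 90.0])
its_later_t1 = np.array([82.0, 79.0, 85.0, 77.0])
its_later_t2 = np.array([86.0, 83.0, 88.0, 82.0])

# EarlyGain = Test 2 score minus Test 1 score, computed per student so that
# each individual's starting level is controlled for.
early_gain_earlier = its_earlier_t2 - its_earlier_t1
early_gain_later = its_later_t2 - its_later_t1

# One-sample t-test: is a group's mean gain greater than zero?
t_e, p_e = stats.ttest_1samp(early_gain_earlier, 0.0)

# Two-sample t-test: did the ITS Earlier group gain more than the ITS Later group?
t_between, p_between = stats.ttest_ind(early_gain_earlier, early_gain_later)
```

The same computation applied to Tests 2 and 3 yields LateGain, used later in the paper.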

    EarlyGain scores, reported in the middle columns of Table 2 Panel A, indicate

    that the performance of the ITS Earlier group increased by 11.8 percent between Tests 1

    7 We also attempted to control for differences by reanalyzing the data using the Chapter 1 homework score

    and GPA as covariates. These measures were positively correlated with Test 1 scores (p


and 2. A one-sample t-test indicates that this difference was greater than zero (t=5.06, p


    al. (2009), which found that students (who performed relatively poorly on an initial test)

    improved more after completing homework with an ITS than peers who completed the

    same homework using paper-and-pencil. To improve on the design of Johnson et al.

(2009) and rule out the differential maturation explanation, we incorporated a removed

    treatment design in the current study. By removing the ITS treatment from the ITS

    Earlier group and giving it to the ITS Later group near the midpoint of our study, we are

    able to place the effects of differential maturation in direct opposition to the effects of the

    experimental treatments. That is, if differential maturation accounted for the ITS Earlier

group's greater improvement between Tests 1 and 2, the ITS Earlier group would

    continue to enjoy greater (or, at worst, equal) gain scores between Tests 2 and 3. If, on

    the other hand, the ITS treatment caused the different levels of improvement between

    Tests 1 and 2, then providing the ITS treatment to the ITS Later group would reverse the

    pattern of differences; the ITS Later group would improve more between Tests 2 and 3

    than the ITS Earlier group.

    To determine the extent to which performance improved between Tests 2 and 3,

    we calculated LateGain by subtracting Test 2 scores from Test 3. The middle columns of

    Table 2 Panel B report descriptive statistics for LateGain, which indicate that the

    performance of the ITS Earlier group increased by 2.0 percent between Tests 2 and 3. A

    one-sample t-test indicates that this increase was not statistically different from zero

    (t=1.37, p=0.177). In contrast, the ITS Later group increased its average performance

    between Tests 2 and 3 by 5.2 percent. This improvement was statistically greater than

    zero (t=3.38, p=0.001). The right-hand column of Table 2 confirms these results by

    reporting t-test comparisons of the LateGain scores for the ITS Earlier and ITS Later



    groups. As shown, the ITS Earlier group's improvement of 2.0 percent was less than the

    ITS Later group's improvement of 5.2 percent (t=1.49, p=0.069).8 These results suggest that the

    students who used the ITS between Tests 2 and 3 learned more than the students who

    used the OHS during that time. Importantly, these results differ from the pattern that

    would be expected if groups matured at different rates as a result of their different

    starting points.
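    The gain-score analysis above follows a standard pattern: compute each student's gain as the
    difference between consecutive test scores, test each group's mean gain against zero, and then
    compare the mean gains of the two groups. A minimal sketch of that workflow using SciPy is shown
    below; the score arrays are small made-up examples, not the study's per-student data (which are
    not reproduced in this paper).

```python
import numpy as np
from scipy import stats

# Hypothetical per-student Test 2 and Test 3 scores (illustrative only,
# NOT the study's data), a few students per group.
test2_earlier = np.array([84.0, 86.0, 83.0, 85.0, 87.0])
test3_earlier = np.array([86.0, 87.0, 85.0, 86.0, 88.0])
test2_later = np.array([86.0, 88.0, 85.0, 87.0, 89.0])
test3_later = np.array([91.0, 93.0, 91.0, 92.0, 95.0])

# LateGain: each student's Test 3 score minus Test 2 score.
lategain_earlier = test3_earlier - test2_earlier
lategain_later = test3_later - test2_later

# One-sample t-tests: is each group's mean gain different from zero?
t_e, p_e = stats.ttest_1samp(lategain_earlier, 0.0)
t_l, p_l = stats.ttest_1samp(lategain_later, 0.0)

# Two-sample t-test: did the ITS Later group gain more than the ITS Earlier group?
t_diff, p_two_sided = stats.ttest_ind(lategain_later, lategain_earlier)
p_one_sided = p_two_sided / 2  # directional prediction, as in the paper
```

    With real data, the two-sample comparison would be the direct analogue of the LateGain
    contrast reported in Table 2.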

    DISCUSSION AND CONCLUSION

    This study examined the performance of students who had practiced transaction

    analysis and recording using an online homework system (OHS) and an online intelligent

    tutoring system (ITS). Results indicated that students' ability to account for transactions

    increased over the period of the study, but at a significantly faster rate when students used

    the ITS rather than the OHS. These effects are noteworthy because they do not merely

    reflect an active learning effect; students were actively engaged in homework practice

    whether it was presented via the OHS or the ITS. Also, the results were not driven by

    mere differences in modality (i.e., online versus paper-and-pencil), which is a potential

    weakness of prior studies (e.g., Johnson et al. 2009); in the current study, both systems

    were accessed online.

    If the performance differences were not attributable to differences in active

    learning or modality, to what were they attributable? We believe two key attributes of

    these systems were responsible for causing the performance differences. First, the nature

    of feedback provided by each system differed markedly. Although feedback from the

    OHS was immediate, it was limited solely to the accuracy of each student response. After

    8 The likelihood of detecting increased scores between Tests 2 and 3 is lower than that between Tests 1 and

    2; at Test 2, the average student score in both groups was within one standard deviation of 97 percent.


    receiving this outcome feedback, students were left to infer for themselves which element

    of their thinking process went astray. Perhaps they recorded the wrong account type (e.g.,

    an asset rather than a revenue), wrong account name (e.g., inventory rather than supplies),

    or wrong amount (e.g., a partial payment rather than the entire purchase cost); a stumble

    with any one of these elements would lead to an incorrect answer. In contrast, the ITS

    provided detailed feedback on these elements. Students using the ITS learned not only

    whether the final outcome was accurate, but also whether the thinking process that led to

    that final outcome was sound.

    A second key difference between the OHS and the ITS related to the amount of

    instructional support they provided students. The ITS was designed to support students

    throughout the problem-solving process. Consequently, students could ask the system

    questions tailored specifically to their particular point of uncertainty or confusion. A

    student who was unsure why a particular event would be considered an accounting

    transaction was able to ask the system to explain that point. Another student who

    understood this previous point but did not see the distinction between assets and revenues

    was able to ask the ITS to explain that later stage of the problem-solving process. A

    student who did not know where to start could ask the ITS to work through and

    explicitly model each step of the thinking process, just as a human tutor would. In

    contrast, an OHS is primarily concerned with assessing outcomes, so it provides

    relatively less instructional support. This point is not a criticism of an OHS, but rather

    just an acknowledgement of a different focus. Many participants in our study anecdotally

    commented that they appreciated the assessment function that the OHS provided and

    some criticized the ITS for its emphasis on helping rather than assessing. These latter


    students apparently preferred to use homework practice as an early signal of exam

    performance rather than as an opportunity to build knowledge and skills.9

    Although we are able to describe key differences in the feedback and instructional

    support that the two systems provide, our research did not directly test the impact of these

    differences on student learning. This limitation provides a worthwhile direction for future

    research. Also, as one of the first studies in this area, we did not manipulate

    the many features available in an OHS or ITS to determine their impact on student

    learning. Consequently, we cannot conclude that the observed differences in student

    learning generalize beyond the specific ways in which the two systems were implemented

    in this study. A final limitation is that the current study examines only a subset of the

    potential costs and benefits of an OHS and ITS. Beyond their impact on student exam

    performance, these systems are likely to impact student perceptions, instructor efficiency,

    and possibly instructor effectiveness. As Ijiri (1983, 173) noted more than twenty-five

    years ago, computers "can make accounting education more efficient (and) leave us

    more time in the classroom to spend on the other important dimension of accounting,

    namely the human dimension." We hope that the research reported in this paper will

    encourage accounting educators to take advantage of these opportunities.

    9 Another limitation of the ITS examined in this study is that it covers only a subset of financial accounting

    topics (i.e., transaction analysis, adjustments, and financial statement preparation). An ITS has not yet been

    developed for topics such as inventory costing, depreciation, and so on.


    References

    Arasasingham, R. D., M. Taagepera, F. Potter, I. Martorell, and S. Lonjers. 2005.

    Assessing the effect of web-based learning tools on student understanding of

    stoichiometry using knowledge space theory. Journal of Chemical Education 82

    (August): 1251-1262.

    Balzer, W. K., M. E. Doherty, and R. O'Connor. 1989. Effects of cognitive feedback on

    performance. Psychological Bulletin 106: 410-433.

    Bonham, S. W., D. L. Deardorff, and R. J. Beichner. 2003. Comparison of student

    performance using web and paper-based homework in college-level physics.

    Journal of Research in Science Teaching 40 (December): 1050-1071.

    Bonham, S. W., R. J. Beichner, and D. L. Deardorff. 2001. Online homework: Does it

    make a difference? The Physics Teacher 39 (May): 293-296.

    Bonner, S. E., and P. L. Walker. 1994. The effects of instruction and experience in the

    acquisition of auditing knowledge. The Accounting Review 69 (1): 157-178.

    Borthick, A. F., M. B. Curtis, and R. S. Sriram. 2006. Accelerating the acquisition of

    knowledge structure to improve performance in internal control reviews.

    Accounting, Organizations and Society 31 (4-5): 323-342.

    Cheng, K. K., B. A. Thacker, R. L. Cardenas, and C. Crouch. 2004. Using an online

    homework system enhances students' learning of physics in an introductory

    physics course. American Journal of Physics 72: 1447-1453.

    Cole, R. S., and J. B. Todd. 2003. Effects of web-based multimedia homework with

    immediate rich feedback on student learning in general chemistry. Journal of

    Chemical Education 80 (November): 1338-1343.


    Cook, T. D., and D. T. Campbell. 1979. Quasi-Experimentation: Design & Analysis

    Issues for Field Settings. Boston, MA: Houghton Mifflin Company.

    Dillard-Eggars, J., T. Wooten, B. Childs, and J. Coker. 2008. Evidence on the

    effectiveness of on-line homework. College Teaching Methods & Styles Journal 4

    (May): 9-15.

    Dufresne, R. J., J. P. Mestre, D. M. Hart, and K. A. Rath. 2002. The effect of web-based

    homework on test performance in large enrollment introductory physics courses.

    Journal of Computers in Mathematics and Science Teaching 21: 229-251.

    Gaffney, M. A., D. Ryan, and C. Wurst. 2009. Do on-line homework systems improve

    performance? Working paper, Temple University.

    Halabi, A. K., J. E. Tuovinen, and A. A. Farley. 2005. The cognitive load of computer

    based learning materials for introductory accounting. Issues in Accounting

    Education 20: 21-32.

    Hirsch, L., and C. Weibel. 2003. Statistical evidence that web-based homework helps.

    Focus 23 (February): 14.

    Ijiri, Y. 1983. New dimensions in accounting education: Computers and algorithms.

    Issues in Accounting Education 1: 168-173.

    Johnson, B. G., F. Phillips, and L. G. Chase. 2009. An intelligent tutoring system for the

    accounting cycle: Enhancing textbook homework with artificial intelligence.

    Journal of Accounting Education 27 (March): 30-39.

    Kulik, J. A., and C. C. Kulik. 1998. Timing of feedback and verbal learning. Review of

    Educational Research 58 (Spring): 79-97.


    Lindquist, T. M., and L. M. Olsen. 2007. How much help, is too much help? An

    experimental investigation of the use of check figures and completed solutions in

    teaching intermediate accounting. Journal of Accounting Education 25 (1-2):

    103-117.

    McNamara, D. S., E. Kintsch, N. B. Songer, and W. Kintsch. 1996. Are good texts

    always better? Interactions of text coherence, background knowledge and levels

    of understanding in learning from text. Cognition and Instruction 14: 1-43.

    Rosenthal, R., and R. Rosnow. 1985. Contrast Analysis: Focused Comparisons in the

    Analysis of Variance. New York: Cambridge University Press.

    Schmidt, R. A., and R. A. Bjork. 1992. New conceptualizations of practice: Common

    principles in three paradigms suggest new concepts for training. Psychological

    Science 3 (4): 207-217.

    Wynder, M. B., and P. F. Luckett. 1999. The effects of understanding rules and a worked

    example on the acquisition of procedural knowledge and task performance.

    Accounting and Finance 39 (2): 177-203.

    Zerr, R. 2007. A quantitative and qualitative analysis of the effectiveness of online

    homework in first-semester calculus. Journal of Computers in Mathematics and

    Science Teaching 26 (1): 55-73.


    Table 1

    Analysis of Mean Performance Scores

    (Standard Deviation in Parentheses)

    Performance Score     ITS Earlier (n = 71)   ITS Later (n = 68)   Test Statistics
    Test 1                72.9 (21.9)            80.7 (16.5)          t = 2.35, p = 0.020
    Test 2                84.7 (12.7)            86.8 (10.0)          t = 1.05, p = 0.294
    Test 3                86.7 (14.6)            92.0 (13.4)          t = 2.19, p = 0.030
    Chapter 1 Homework    91.8 (0.05)            94.6 (0.05)          t = 3.28, p = 0.001
    GPA                   2.88 (0.29)            3.00 (0.27)          t = 2.44, p = 0.016
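    As a sanity check, the two-sample t statistics in Table 1 can be approximately recovered from
    the reported group means, standard deviations, and sample sizes alone. A sketch for the Test 1
    row is shown below; it assumes a pooled-variance t-test was used (the reported values agree with
    this up to rounding of the summary statistics).

```python
from scipy.stats import ttest_ind_from_stats

# Summary statistics from Table 1, Test 1 row:
# ITS Earlier: mean 72.9, SD 21.9, n = 71; ITS Later: mean 80.7, SD 16.5, n = 68.
t, p = ttest_ind_from_stats(mean1=72.9, std1=21.9, nobs1=71,
                            mean2=80.7, std2=16.5, nobs2=68,
                            equal_var=True)  # pooled-variance t-test (assumption)
print(round(abs(t), 2), p)  # close to the reported t = 2.35, p = 0.020
```

    The remaining rows of Table 1 can be checked the same way by substituting their summary
    statistics.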


    Table 2

    Analysis of Mean Gain Scores

    (Standard Deviation in Parentheses)

                          ITS Earlier (n = 71)   ITS Later (n = 68)   Test Statistics
    Panel A:
    EarlyGain*            11.8 (19.5)            6.1 (15.9)           t = 1.89, p = 0.030
    Difference from zero  t = 5.06, p