
2013 International Conference on Advanced Computer Science Applications and Technologies (ACSAT), Kuching, Malaysia, 23-24 December 2013



Validating Instrument Quality for Measuring Students’ Acceptance of an Online Discussion Site (ODS)

Prasanna Ramakrisnan
Faculty of Computer and Mathematical Sciences (FSKM)
Universiti Teknologi MARA (UiTM), Shah Alam, Malaysia
[email protected]

Azizah Jaafar
Institute of Visual Informatics (IVI)
National University of Malaysia (UKM), Bangi, Malaysia
[email protected]

Noor Faezah Mohd Yatim
Faculty of Information Science and Technology (FTSM)
National University of Malaysia (UKM), Bangi, Malaysia
[email protected]

Mohd Noor Mamat
i-Learn Centre (i-LeC)
Universiti Teknologi MARA (UiTM), Shah Alam, Malaysia
[email protected]

Abstract—The primary purpose of this study was to provide validity evidence for the items used to measure the intention to use an Online Discussion Site (ODS). Seventy-seven students participated in this study. The extended Technology Acceptance Model (TAM) was adapted for this study, and the validity of its items was evaluated using Rasch analysis. There are several aspects of validity in relation to Rasch analysis; this paper discusses validity in terms of item quality. To maximize the quality of the items in the model, an item removal process was conducted. Two criteria were used to judge item quality: (1) point measure correlation and (2) fit statistics. Items that did not meet these two criteria were eliminated. There were 24 items in the original model, but the results show that only 23 items can be used for modelling students’ intention to use the ODS.

Keywords-Intention to Use, Online Discussion Site (ODS), Rasch Analysis

I. INTRODUCTION

Online Discussion Sites (ODS) are widely used in many universities as a medium to assist teaching and learning in blended learning courses. Blended learning courses require face-to-face sessions as well as online sessions. The face-to-face sessions are conducted in classrooms, while the online sessions are usually conducted via an ODS. The ODS is a tool embedded in an e-learning platform.

In Universiti Teknologi MARA (UiTM), students taking blended learning courses are connected to each other in the ODS through the i-Learn Portal (the university's e-learning platform). They are able to discuss tutorial and subject-related questions online at any time, regardless of their location. As one of the largest universities in Malaysia, UiTM has the highest student enrolment every semester. With the increase in the number of students, many generic courses are now being converted to the blended learning mode to reduce classroom utilization hours.

For successful implementation of blended learning, the ODS needs to be used by the students to address subject-related questions. There are many factors that influence students’ usage of the ODS, but the interest of this study is to identify the factors that influence students’ intention to use it. Therefore, the extended TAM model [1] was validated to determine whether it is a valid instrument for assessing students’ intention to use the ODS.

II. INTENTION TO USE

An individual’s intention to use a technology, and his or her actual usage of it, are commonly studied using the Technology Acceptance Model (TAM). TAM was developed and validated by Davis based on the Theory of Reasoned Action (TRA) [2], [3]. This model suggests that an individual’s attitude towards using a particular system is influenced by its perceived ease of use and perceived usefulness.

Davis [3] defined perceived ease of use (PEU) as “the degree to which an individual believes that using a particular system would be free of physical and mental effort” and perceived usefulness (PU) as “the degree to which a person believes that using a particular system would enhance his or her job performance”.

There are many previous studies that have applied the TAM to investigate students’ intention to use e-learning technologies [4], [5], [6], [7], [8], [9], [10], [11], [12], [13], [14], [15], [16]. Examples of technologies that support e-learning are learning management systems, blogs, internet forums, social networking sites, wikis, instant messaging, etc. However, most of these studies used the TAM to examine e-learning systems in general, whereas the extended TAM model was proposed specifically for studying intention to use an online learning community [1].

III. METHOD

A. Participants
The data was collected from full-time undergraduate students at Universiti Teknologi MARA (UiTM) who use the i-Learn portal, the university's e-learning portal, for their academic discussions. Seventy-seven responses were collected from survey questionnaires distributed manually and online.

2013 International Conference on Advanced Computer Science Applications and Technologies

978-1-4799-2758-6/13 $31.00 © 2013 IEEE

DOI 10.1109/ACSAT.2013.99



B. Instrument
The instrument adopted for this study was designed by Liu et al. [1]. The response categories used for this instrument were (1) strongly disagree, (2) disagree, (3) agree, and (4) strongly agree. Table I summarizes the instrument used for this study.

C. Data Analysis
The instrument, consisting of 24 items, was administered manually and online to the students. The data collected was analysed using Rasch analysis software (Winsteps 3.68.2). The purpose of this study was to scrutinize the instrument constructs and identify whether all 24 items are required for investigating intention to use the ODS. This study shows all the procedures required to examine construct validity.

TABLE I. THE ITEMS OF THE INSTRUMENT (ADAPTED FROM [1])

Item     Statement

Online Course Design (OCD)
OCD1     The course content is interesting.
OCD2     The course content level is mid-range.
OCD3     The course content meets my needs.
OCD4     In general, I am satisfied with the design of the course content and quality.

User-interface Design (UID)
UID1     The layout design of the Online Discussion Site (ODS) makes it easy to read.
UID2     The font style, colour and layout of the interface make it comfortable for me to read.
UID3     In general, I am satisfied with the design of the Online Discussion Site (ODS) interface.

Previous Online Learning Experience (POLE)
POLE1    I feel it would be easier to operate the Online Discussion Site (ODS) if I had previous experience of using it.
POLE2    I will have a better understanding of how to use the Online Discussion Site (ODS) if it has a function for online guidance.
POLE3    I will have a better understanding of how to use the Online Discussion Site (ODS) if a lecturer or peer operates it first.

Perceived Usefulness (PU)
PU1      I could improve my learning performance by using the Online Discussion Site (ODS).
PU2      I could improve my learning by using the Online Discussion Site (ODS).
PU3      I could increase my learning productivity by using the Online Discussion Site (ODS).
PU4      I think using the Online Discussion Site (ODS) helps me learn.

Perceived Ease of Use (PEOU)
PEOU1    I feel that the interface design and information delivery in the Online Discussion Site (ODS) are clear and easy to understand.
PEOU2    It is easy for me to do the things that I want to do by operating this Online Discussion Site (ODS).
PEOU3    I feel the Online Discussion Site (ODS) is easy to handle when I encounter a problem.
PEOU4    In general, I feel it is easy for me to use the Online Discussion Site (ODS).

Perceived Interaction (PI)
PI1      I discuss relevant learning topics with other members on the Online Discussion Site (ODS).
PI2      I’m able to send e-mails to other members as a way of communicating.
PI3      I’m able to engage in real-time learning interaction with other members in the Online Discussion Site (ODS).
PI4      In general, I think this Online Discussion Site (ODS) provides good opportunities for interaction with other users.

Intention to Use an Online Discussion Site (IUODS)
IUODS1   I intend to use this Online Discussion Site (ODS) for activities that involve learning.
IUODS2   I will reuse this Online Discussion Site (ODS) for relevant learning activities.

1) The Rasch Model
The Rasch model is applied to measure latent traits, or a person’s ability, in various disciplines. Latent traits are usually assessed through the responses of a sample of users to a set of measurement-scale items. The model estimates the locations of the items and users on the measurement scale from the proportion of responses of each user to each item.

The probability of success depends on the difference between the ability of the person and the difficulty of the item. According to the Rasch model, a user with greater ability has a greater likelihood of endorsing any given item, and easier items are more likely to be endorsed by all users [17].

Item difficulty and person ability are expressed in logits by transforming the raw-score percentage (an ordinal scale) into a success-to-failure ratio, or odds. This odds value is then converted to its natural logarithm (an interval scale). The scale resulting from the Rasch analysis of the ordinal responses therefore has the properties of an interval scale: it is linear, and the numbers indicate how much more of the attribute of interest is present.
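The transformation just described can be sketched in a few lines. This is a minimal illustration for the dichotomous case; the function names are ours, not part of Winsteps.

```python
import math

def logit_from_proportion(p):
    """Convert a raw-score proportion (ordinal scale) into a logit
    (interval scale): the natural log of the success-to-failure odds."""
    return math.log(p / (1 - p))

def rasch_probability(ability, difficulty):
    """Probability of endorsing an item under the dichotomous Rasch model.
    It depends only on the difference between person ability and item
    difficulty, both expressed in logits."""
    return 1 / (1 + math.exp(-(ability - difficulty)))

# A person located exactly at an item's difficulty has a 50:50 chance:
print(rasch_probability(0.0, 0.0))  # 0.5
```

A raw proportion of 0.5 maps to 0 logits, which is why a person at 0 logits has an even chance of endorsing an item of average difficulty.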

The Rasch model makes three basic assumptions: first, each user is characterized by an ability and each item by a difficulty; second, users and items can be represented as numbers along a single line; and third, the probability of observing any particular scored response can be computed from the difference between those numbers [17]. Thus the model can be used to link persons to items along a relative ordering of the latent variable.

The validity testing of the intention-to-use-ODS instrument was done by applying the Rasch model for data analysis. The item hierarchy obtained from the person-item distribution map provides an indication of construct validity [18]. The Rasch model was used to validate the measures of the constructs in the instrument: Online Course Design (OCD), User-interface Design (UID), Previous Online Learning Experience (POLE), Perceived Usefulness (PU), Perceived Ease of Use (PEOU), Perceived Interaction (PI) and Intention to Use an Online Discussion Site (IUODS).

IV. FINDING AND ANALYSIS

An instrument fit analysis was performed to identify how well each item fits within the underlying construct of the instrument



and to remove invalid items from the instrument. The item removal process was conducted to maximize item quality. Item quality was examined using the criteria shown in Table II below.

TABLE II. CRITERIA USED FOR REMOVAL OF THE ITEMS

Criteria                       Cut-off Point
Point Measure Correlation      0.32 < x < 0.8
Infit / Outfit Mean Square     0.5 < y < 1.5
Infit / Outfit Z-Standard      -2.0 < Z < 2.0

Point measure correlation was used to check the validity of the responses provided by the students. A student's response pattern is said to be as expected by the Rasch model if the value obtained for the point measure correlation is within the range given in Table II.

Any item that did not meet these criteria was eliminated. The analysis shows that all of the point-measure correlations were above 0.32, which supports the notion that all the items measure a single construct [19].

Fit statistics were used to control the quality of the items in the instrument by removing items that did not fit the Rasch model. There are two types of fit statistics: infit and outfit. The infit statistic (weighted) reports patterns of responses to items targeted on the person, while the outfit statistic (un-weighted) reflects the response pattern to items whose difficulty is far from the person's ability. The infit and outfit statistics are reported as MnSq (mean of the squared residuals) and the standardized Zstd (z-standard) to show the amount of randomness in the measurement.
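The weighted/un-weighted distinction can be made concrete for the dichotomous case. The sketch below uses our own function names and is an illustrative computation, not the Winsteps implementation; the polytomous model behind this four-category instrument uses a more general variance term.

```python
import math

def rasch_probability(ability, difficulty):
    """Dichotomous Rasch probability of an endorsed response."""
    return 1 / (1 + math.exp(-(ability - difficulty)))

def item_fit_mnsq(responses, abilities, difficulty):
    """Infit and outfit mean squares for one item.

    responses[i] is person i's score (0 or 1); abilities are in logits.
    Outfit is the plain mean of squared standardized residuals, so it is
    dominated by persons far from the item; infit weights each squared
    residual by the model variance (information), emphasizing persons
    well targeted on the item.
    """
    sq_resid, variances, z_sq = [], [], []
    for x, b in zip(responses, abilities):
        p = rasch_probability(b, difficulty)
        v = p * (1 - p)                 # model variance of the response
        sq_resid.append((x - p) ** 2)   # squared residual
        variances.append(v)
        z_sq.append((x - p) ** 2 / v)   # squared standardized residual
    outfit = sum(z_sq) / len(z_sq)             # un-weighted mean square
    infit = sum(sq_resid) / sum(variances)     # information-weighted mean square
    return infit, outfit
```

When the data fit the model, both statistics are expected to be close to 1; values well below 1 indicate overly predictable (overfitting) responses and values well above 1 indicate noisy (underfitting) responses.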

The recommended ranges for the infit/outfit mean square and z-standard are shown in Table II. These ranges were used to identify misfit items in the model. Misfit items show unexpected variance in their response patterns and usually either overfit or underfit the model. Overfitting items have low mean square and z-standard values; their responses are predictable from the other responses, "too good to be true". Underfitting items have high mean square and z-standard values, with unpredictable responses. Hence, any item with a poor fit statistic value was considered for removal from the instrument.
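The removal rule in Table II can be expressed directly as a predicate. This is a sketch; `passes_quality_criteria` is our own helper, but the cut-offs are exactly the ones listed in the table.

```python
def passes_quality_criteria(ptmea_corr, infit_mnsq, outfit_mnsq,
                            infit_zstd, outfit_zstd):
    """Return True if an item meets all Table II cut-offs:
    0.32 < point measure correlation < 0.8,
    0.5 < infit/outfit MnSq < 1.5, and
    -2.0 < infit/outfit Zstd < 2.0."""
    if not (0.32 < ptmea_corr < 0.8):
        return False
    if not all(0.5 < m < 1.5 for m in (infit_mnsq, outfit_mnsq)):
        return False
    return all(-2.0 < z < 2.0 for z in (infit_zstd, outfit_zstd))

# Values taken from Table III: UID1 passes, while UID2 (high MnSq and
# Zstd, i.e. underfitting) does not.
print(passes_quality_criteria(0.51, 0.97, 1.04, -0.10, 0.30))  # True
print(passes_quality_criteria(0.59, 1.62, 1.62, 2.50, 2.10))   # False
```

Applying this predicate to every row of Table III reproduces the screening described in the next section.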

A. Item Quality
The item removal process was done to maximize the quality of the items in the instrument. The point measure correlation and fit statistics were used as the criteria for item removal.

Table III displays all the items in order of decreasing difficulty to endorse, together with their point measure correlations and fit statistics. The response patterns for all the items were as expected by the Rasch model: all the point measure correlation values were within the range.

Four items had values outside the expected fit statistics range: OCD4, IUODS1, UID2 and PI4. Items OCD4, IUODS1 and PI4 were identified as overfitting the model. The responses for these three items were overly predictable from the other responses and did not provide any new information.

TABLE III. ITEM ORDER, POINT MEASURE CORRELATION AND FIT STATISTICS

Item     Difficulty   Point Measure   Infit Statistics    Outfit Statistics
         (logit)      Correlation     MnSq      Zstd      MnSq      Zstd
UID1       0.78        0.51           0.97     -0.10      1.04      0.30
PEOU3      0.71        0.55           1.21      1.20      1.14      0.60
OCD4       0.71        0.73           0.65     -2.20      0.52     -2.50
OCD3       0.63        0.45           1.12      0.70      0.96     -0.10
UID3       0.63        0.67           1.14      0.80      1.08      0.40
PEOU1      0.56        0.67           1.22      1.20      1.12      0.60
PI3        0.49        0.60           1.38      1.90      1.40      1.60
POLE3      0.33        0.50           1.21      1.10      1.22      0.90
IUODS1     0.26        0.73           0.65     -2.00      0.53     -2.20
PI1        0.18        0.62           1.24      1.20      1.21      0.90
PEOU2      0.10        0.69           1.25      1.20      1.28      1.10
PI2        0.10        0.62           0.78     -1.10      0.74     -1.00
PEOU4      0.02        0.68           0.77     -1.20      0.67     -1.40
PU4       -0.07        0.62           0.73     -1.40      0.65     -1.50
POLE1     -0.15        0.66           0.80     -0.90      0.73     -1.10
UID2      -0.23        0.59           1.62      2.50      1.62      2.10
PU1       -0.32        0.71           0.69     -1.60      0.61     -1.70
OCD2      -0.32        0.54           0.93     -0.30      1.29      1.10
PU2       -0.40        0.65           1.11      0.60      1.04      0.30
IUODS2    -0.66        0.68           0.97     -0.10      0.85     -0.50
PU3       -0.66        0.71           0.76     -1.10      0.70     -1.10
POLE2     -0.75        0.59           1.18      0.80      1.23      0.90
PI4       -0.93        0.74           0.51     -2.70      0.42     -2.60
OCD1      -1.01        0.54           0.82     -0.80      0.88     -0.30
Note: MnSq = Mean Square, Zstd = Z-Standard

Item UID2 was recognized as an underfitting item because it was outside the expected fit statistics range, with high mean square and z-standard values (Table III). Its response pattern was therefore too unpredictable. This is confirmed by the scalogram for item UID2, which shows an unexpected response pattern relative to person ability: students with higher agreeability found item UID2 difficult to endorse, while students with lower agreeability endorsed it easily.

This confirms that UID2 is an underfitting item and needs to be eliminated from the instrument for successful implementation of Rasch measurement.

Rasch analysis was thus used for item validation, to maximize the quality of the items used in studying students’ intention to use the ODS. Through the analysis, item UID2 was identified as a misfit (an underfitting item) and removed from the instrument. In conclusion, there were 24 items in the original instrument [1]; after removal of the one misfit (UID2), the final instrument consisted of 23 items.

B. Instrument Fit
Summary statistics provide information on instrument fit: they show whether the data obtained fits as expected by the Rasch model. The item reliability results were used to identify whether the item ordering would be replicated if the items in the instrument were tested with another sample of the same size [17]. After the removal of item UID2, item reliability increased to 0.71 (see Table IV), which indicates replicability in the instrument.

The mean item infit and outfit mean square values were estimated at 0.99 and 0.95 respectively. Both values are very close to 1 and within the expected range as



shown in Table II. The infit and outfit Z-standard values were found to be -0.10 and -0.20 respectively, which is within the normality range of -2 < Z < +2.

The item mean is always set at 0 logit for calibration purposes. A person located at 0 logit has a 50:50 chance of successfully endorsing an item of average difficulty. Thus, the overall finding indicates that the instrument has a fair fit, with an item reliability of 0.71 [20].

TABLE IV. SUMMARY STATISTICS

Statistics (logits)          Before Item Removal   After Item Removal
Mean              Item             0.00                  0.00
                  Person           2.14                  2.16
Reliability       Item             0.69                  0.71
                  Person           0.90                  0.90
Mean Infit MnSq   Item             0.99                  0.99
                  Person           0.95                  0.95
Mean Outfit MnSq  Item             0.96                  0.95
                  Person           0.96                  0.95
Mean Infit Zstd   Item            -0.10                 -0.10
                  Person          -0.30                 -0.30
Mean Outfit Zstd  Item            -0.20                 -0.20
                  Person          -0.40                 -0.30
Note: MnSq = Mean Square, Zstd = Z-Standard

V. CONCLUSION

The current study provides evidence that the 23-item instrument is a valid measurement for identifying students’ intention to use the ODS and that all the items measure a single construct. One item was identified as a misfit and eliminated, so the final instrument contains only 23 items. Further studies will investigate these 23 items to understand the factors that can assist universities in improving their ODS for teaching and learning purposes.

REFERENCES

[1] I.-F. Liu, M. C. Chen, Y. S. Sun, D. Wible, and C.-H. Kuo, “Extending the TAM model to explore the factors that affect Intention to Use an Online Learning Community,” Computers & Education, vol. 54, no. 2, pp. 600–610, 2010.

[2] F. D. Davis, R. P. Bagozzi, and P. R. Warshaw, “User acceptance of computer technology: a comparison of two theoretical models,” Management science, vol. 35, no. 8, pp. 982–1003, 1989.

[3] F. D. Davis, “A technology acceptance model for empirically testing new end-user information systems: Theory and results.” Massachusetts Institute of Technology, 1985.

[4] S.-H. Liu, H.-L. Liao, and C.-J. Peng, “Applying The Technology Acceptance Model and Flow Theory to Online e-Learning Users’ Acceptance Behavior,” E-learning, vol. 4, no. H6, p. H8, 2005.

[5] S. Y. Park, “An analysis of the technology acceptance model in understanding university students’ behavioral intention to use e-learning,” Educational Technology & Society, vol. 12, no. 3, pp. 150–162, 2009.

[6] J. C. Roca, C.-M. Chiu, and F. J. Martínez, “Understanding e-learning continuance intention: An extension of the Technology Acceptance Model,” International Journal of Human-Computer Studies, vol. 64, no. 8, pp. 683–696, 2006.

[7] M. Masrom, “Technology acceptance model and e-learning,” 2007.

[8] C.-S. Ong and J.-Y. Lai, “Gender differences in perceptions and relationships among dominants of e-learning acceptance,” Computers in Human Behavior, vol. 22, no. 5, pp. 816–829, 2006.

[9] R. G. Saadé, F. Nebebe, and W. Tan, “Viability of the ‘Technology Acceptance Model’ in Multimedia Learning Environments: A Comparative Study,” Interdisciplinary Journal of Knowledge and Learning Objects, vol. 3, no. 2, pp. 175–184, 2007.

[10] E. M. Van Raaij and J. J. L. Schepers, “The acceptance and use of a virtual learning environment in China,” Computers & Education, vol. 50, no. 3, pp. 838–852, 2008.

[11] E. W. T. Ngai, J. K. L. Poon, and Y. H. C. Chan, “Empirical examination of the adoption of WebCT using TAM,” Computers & Education, vol. 48, no. 2, pp. 250–267, 2007.

[12] K. A. Pituch and Y. Lee, “The influence of system characteristics on e-learning use,” Computers & Education, vol. 47, no. 2, pp. 222–244, 2006.

[13] B.-C. Lee, J.-O. Yoon, and I. Lee, “Learners’ acceptance of e-learning in South Korea: Theories and results,” Computers & Education, vol. 53, no. 4, pp. 1320–1329, 2009.

[14] M. J. Sanchez-Franco, “WebCT–The quasimoderating effect of perceived affective quality on an extending Technology Acceptance Model,” Computers & Education, vol. 54, no. 1, pp. 37–46, 2010.

[15] Y.-C. Lee, “An empirical investigation into factors influencing the adoption of an e-learning system,” Online Information Review, vol. 30, no. 5, pp. 517–541, 2006.

[16] S.-S. Liaw, “Investigating students’ perceived satisfaction, behavioral intention, and effectiveness of e-learning: A case study of the Blackboard system,” Computers & Education, vol. 51, no. 2, pp. 864–873, 2008.

[17] T. Bond and C. Fox, Applying the Rasch model: Fundamental measurement in the human sciences. Lawrence Erlbaum, 2007.

[18] E. V Smith Jr, “Evidence for the reliability of measures and validity of measure interpretation: a Rasch measurement perspective.,” Journal of Applied Measurement, 2001.

[19] M. L. Finlayson, E. W. Peterson, K. A. Fujimoto, and M. A. Plow, “Rasch Validation of the Falls Prevention Strategies Survey,” Archives of physical medicine and rehabilitation, vol. 90, no. 12. W.B. Saunders, pp. 2039–2046, 01-Dec-2009.

[20] W. P. J. Fisher, “Rating Scale Instrument Quality Criteria,” Rasch Measurement Transactions, vol. 21, no. 1, p. 1095, 2007.
