
EDIT 6230 Evaluation Report

Brett Copeland
Josh Miller

Fall 2012


Table of Contents

Introduction and Background

Purposes and Stakeholders

Decisions and Questions

Methods

Instrumentation

Limitations

Analysis

Results

Discussion

Recommendations

Appendix A: Pre- and Post-test Questions

Appendix B: User Questionnaire

Appendix C: Focus Group Questions


INTRODUCTION

This evaluation plan describes the strategies that will be used to evaluate the online training library, Lynda.com. This evaluation was developed by Brett Copeland and Joshua Miller as a requirement for EDIT 6230, Software Evaluation and Curriculum Integration, at Georgia College. The background, purposes, limitations, proposed audiences, decisions, questions, methods, sample, instrumentation, logistics, timeline, and budget of the evaluation are all included in this document.

BACKGROUND

Lynda.com is an online training library with over 1,470 online courses and 82,000 tutorials, accessible at any time from a computer or, through Lynda's iPhone and iPad apps, from a mobile device. The courses range from software tutorials to instructional videos on job skills, such as interviewing and salary negotiation, to videos on hobbies such as photography.

Once logged in, the user sees an interface much like YouTube's: a queue of saved videos and the most recently released videos.



After selecting a course, the user is presented with a table of contents, a description of the course, and the time required to watch all of the tutorials associated with the course. There are also exercise files available for download to help the user practice while watching the tutorials.

The videos are much like those found on YouTube or any other online video website, with the ability to pause, rewind, and move to any point in the video if review is desired.



The tutorial being highlighted in this evaluation, entitled “Acing Your Interview,” covers job interview skills. Our client is the Instructional Technology Center at Georgia College. The tutorial will be completed by student workers of the Instructional Technology Center at Georgia College during their assigned shift and while attending to their normal duties.

PURPOSES

The purpose of this evaluation is to provide the staff of the Instructional Technology Center with data to help them decide if using this instructional video with the student workers while they are ‘on the clock’ is effective. Student disposition to the video training, frequency and duration of interruptions, and content assimilation will all be assessed.

This evaluation is summative in that it will not be used to decide whether Lynda.com itself is effective, but whether using the tutorial while dealing with the possible distractions of student worker duties at the ITC produces significant gains in knowledge.

STAKEHOLDERS

The primary stakeholders of this evaluation will be Daniel McDonald, Division Technical Associate at the Instructional Technology Center (ITC), and the student workers of the ITC. Relevant secondary stakeholders include additional staff at the ITC and the greater University community.

The designers and implementers of this evaluation plan are Brett Copeland and Joshua Miller, students enrolled in Software Evaluation and Curriculum Integration, with the guidance of the course instructor, Dr. Eunjung Oh of the Foundations, Secondary Education, and Educational Technology Department, John H. Lounsbury College of Education.

DECISIONS

• Should this training video be used with student workers?

• Is this training video suitable for use when student workers have many distractions?

QUESTIONS

Effectiveness

• Does the student worker feel they have learned anything?

• Does the student worker show increased knowledge based on pre- and post-test responses?

Acceptability

• Does the student worker enjoy using the instructional video?

Suitability for Setting

• How much time does the student worker spend "off task" between initiating and completing the instructional video?

METHODS

Multiple methods of data collection will be used to answer these questions. These methods include:

• User pre- and post-tests

• User questionnaires, including clock time reporting of use

Method                  Effectiveness   Acceptability   Suitability
Pre- and Post-Tests     X
User Questionnaires     X               X               X

INSTRUMENTATION

The instruments used to gather data for the evaluation include:



• User pre-test

• User post-test

• User questionnaire, including clock time reporting of use

The pre- and post-tests have ten multiple-choice questions derived from the instructional video content. The users will answer the ten questions prior to viewing the instructional video and again after viewing the video. Responses to the pre- and post-tests will be compared.

The user questionnaire will be administered following the post-test. The user questionnaire will include an item requesting clock times for the initiation of the first part of the instructional video and its final completion.

LIMITATIONS

The small sample size for this evaluation is an important limiting factor. While the diverse courses of study, backgrounds, and experiences of the group provide an interesting cross section, they do not represent a statistically significant sample of the body of student workers at the ITC. It is also possible that some student workers have experience in the area of salary negotiation, or do not see the relevance of the topic to their lives, and may therefore bring different interest levels that can affect performance.

None of the instruments have been tested for validity and reliability due to time constraints of the program. Persons knowledgeable in the field of survey construction and focus group management will review the instruments.

The heavy reliance on self-reporting tools for this evaluation should be noted. Self-reporting tools, particularly in settings where the employer encourages participation, can be less reliable than direct assessment tools such as observation and performance-based evaluation.

ANALYSIS

Data from the pre- and post-tests and from the survey following the participant's post-test were analyzed in several ways. First, using the username created in the pre-test, answers to questions in the pre- and post-tests were compared against the correct answers to determine the percentage of questions answered correctly on each test. Then, a percentage gain/loss was calculated from the pre-test to the post-test to assess any knowledge gains from viewing the instructional video.

Clock times entered by participants during the survey for the initiation and completion of the instructional video were used to calculate the number of minutes spent on the instructional video.



Using the username created in the pre-test and used by the participant for the duration of the evaluation, data from the survey is linked to the gains/losses from the pre- and post-test calculation described above.

Data entered by participants who did not complete all three components (pre-test, post-test, and survey) were removed from the sample.
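For readers who want to trace these calculations, the sketch below mirrors the analysis steps just described: scoring each test, computing the percentage gain, converting reported clock times to elapsed minutes, joining the three instruments on the participant-created username, and dropping anyone who did not complete all three components. It is purely illustrative; the record layout (dictionaries keyed by username, answer lists, and clock-time strings such as "8:03AM") and the function names are assumptions for the example, not the structure of the actual SurveyMonkey exports.

```python
# Illustrative sketch only: the record layout and names below are assumed
# for this example, not taken from the actual SurveyMonkey exports.
from datetime import datetime


def score(responses, answer_key):
    """Percent of questions answered correctly on one test."""
    correct = sum(1 for q, a in answer_key.items() if responses.get(q) == a)
    return 100.0 * correct / len(answer_key)


def percent_gain(pre, post):
    """Percentage gain from pre-test to post-test, e.g. 70% -> 100% gives 43%."""
    return round(100.0 * (post - pre) / pre)


def minutes_elapsed(start_clock, end_clock, fmt="%I:%M%p"):
    """Minutes between reported clock times such as '8:03AM' and '8:29AM'.

    Assumes both times fall on the same day.
    """
    start = datetime.strptime(start_clock, fmt)
    end = datetime.strptime(end_clock, fmt)
    return int((end - start).total_seconds() // 60)


def analyze(pre_tests, post_tests, surveys, answer_key):
    """Join the three instruments on username and drop incomplete participants."""
    complete = set(pre_tests) & set(post_tests) & set(surveys)
    results = {}
    for user in complete:
        pre = score(pre_tests[user], answer_key)
        post = score(post_tests[user], answer_key)
        results[user] = {
            "pre": pre,
            "post": post,
            "gain_pct": percent_gain(pre, post),
            "minutes": minutes_elapsed(surveys[user]["start"], surveys[user]["end"]),
        }
    # "Time off task" proxy: minutes beyond the fastest complete viewing.
    fastest = min(r["minutes"] for r in results.values())
    for r in results.values():
        r["over_fastest"] = r["minutes"] - fastest
    return results
```

Given the same scores and clock times participants reported, these functions reproduce the percent-gain figures shown in Figure 1 and the time-over-fastest figures shown in Figure 5.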

RESULTS

Five student workers completed the pre-test, post-test, and survey. Of the five who completed all three components, three were female and two were male, all 17-24 years of age. All participants indicated they had had several job interviews in the past and responded that their confidence level had somewhat increased from viewing the instructional video. The overarching theme from open-ended responses about why participants felt more confident was that the video offered good suggestions and tips they had not previously considered.

Responses on the pre-test suggest the majority of the participants had some understanding of the content before viewing the video, with a mean score of 64%. Gains in knowledge from the pre-test to the post-test varied greatly by participant, but the mean score increased markedly to 84% correct responses (see Figure 1).

Participant   Pre-test score (%)   Post-test score (%)   Percent Gain (%)
A             70                   100                   43
B             70                   90                    29
C             30                   60                    100
D             80                   80                    0
E             70                   90                    29

Figure 1 – Pre-test and Post-test results with Percent Gain by Participant

Participants reported widely divergent amounts of distraction and interruption, ranging from never distracted or interrupted to often distracted or interrupted (see Figure 2). Those reporting being distracted or interrupted sometimes or often were more likely to report longer durations of interruption and distraction (see Figure 3).



Figure 2 – Frequency of Interruption/Distraction (reported as Never, Rarely, Sometimes, or Often)

Figure 3 – Correlation between Interruption/Distraction Frequency and Duration of Interruption/Distraction (durations reported as 1-5 seconds or 6-15 seconds)

There is no clear correlation between reporting less frequent distractions and interruptions and the time elapsed while viewing the instructional video, nor between less frequent distractions and interruptions and percentage gain on the post-test.

There does appear to be a connection between performance on the pre- and post-tests and the time of day attempted (see Figure 4). While the participant who completed the video latest in the day showed the largest percentage gain of all participants, this participant's post-test score is lower than every other participant's pre-test score.

Participant   Pre-test score (%)   Post-test score (%)   Time Reported Completing Video
A             70                   100                   8:03AM
B             70                   90                    1:39PM
C             30                   60                    12:45AM
D             80                   80                    9:28PM
E             70                   90                    9:17AM

Figure 4 – Pre-test and Post-test Results with Completion Time (emphasis added)

The data are inconclusive on whether performance on the post-test, taken as an indicator of mastery of the video content, correlates with the "time off task" measure, calculated by subtracting the shortest elapsed viewing time from each participant's elapsed time (see Figure 5). While the outlier of the group, participant C, spent significantly more time on the video than the others, that participant's post-test score shows only mediocre assimilation of the content.

Participant   Pre-test score (%)   Post-test score (%)   Time Elapsed to Complete the Video   Time Greater Than Lowest Time Elapsed
A             70                   100                   26 minutes                           0 minutes
B             70                   90                    26 minutes                           0 minutes
C             30                   60                    45 minutes                           19 minutes
D             80                   80                    27 minutes                           1 minute
E             70                   90                    32 minutes                           6 minutes

Figure 5 – Pre-test and Post-test Results with Time Elapsed to Complete Video

Overall, the participants reported enjoyability ranging from Neutral to Rather Enjoyable, with a mean enjoyability rating of 6.40 out of 7. The themes that positively influenced this rating were the helpfulness of the tips and the relevance of the material.

DISCUSSION

The discussion is organized to address the questions posed by the evaluation.

It is clear that the student workers feel they gained something from watching the video on job interview skills, both from the rating scale and the themes of their comments. Knowledge gains from the pre-test to post-test suggest the student workers’ feeling is valid, with most participants improving their scores on the post-test.

The student workers found the video enjoyable as a group, with the lowest individual ratings being Neutral. Neutral comment themes centered on the video not being all that bad, while higher ratings focused on the information gained from the video and appreciation for tips and ideas not considered before viewing it.

The video's suitability for the setting is a more complex issue. Those student workers who reported fewer and shorter distractions and interruptions showed a greater propensity to assimilate information, as measured by the pre- and post-tests, than those with longer and more frequent distractions and interruptions. The clock time at which the video was viewed may also be a factor, but with only one late-night respondent this conclusion cannot be drawn.

With a fifty percent response rate from our possible pool of respondents, our sample is fairly typical for a web survey. We would be remiss if we did not note the limitations of this evaluation due to the small sample size and the outliers present in the data set.

It should also be noted that the data from three other respondents were excluded because they completed only two of the three required components of the evaluation.

RECOMMENDATIONS

Given that student workers feel more confident and show content gains after viewing the instructional video, the use of the video during paid student worker time appears to be a good use of resources. Student workers should be allowed to decide when to complete the instructional video within a general timeframe so they can avoid times when they will be most distracted or interrupted.

If the ITC wishes to study this topic further, we recommend the following:

• Seek out those respondents who did not complete all three components to identify barriers to completion and increase the response rate.

• Conduct focus groups to gather richer information on the acceptability of the instructional video.

• Because student workers may have been preoccupied with the end of the semester in their own courses, time the study toward the beginning or middle of the semester to improve the response rate.



Appendix A: Pre- and Post-Test Questions

1. What type of phone is best for phone interviews?
a. Droid
b. iPhone
c. Pay Phone
d. Land-line Phone

2. What is NOT one of the non-verbal behaviors to focus on in an in-person interview?
a. eye contact
b. type of handshake
c. tone of voice
d. body language

3. In an informal interview setting, what should you remember to ask for?
a. business cards
b. e-mail addresses
c. phone numbers
d. mailing address

4. What is NOT one of the question types in a job interview?
a. Behavioral
b. Situational
c. Family history
d. Resume

5. When interviewing, what questions should you ask the interviewer?
a. The company
b. Position history
c. Job responsibilities
d. All of the above

6. What is one part of a job posting that would be helpful to study for the interview?
a. Salary
b. Job Description
c. Minimum Requirements
d. Relocation Stipend

7. What is an acronym to help remember how to prepare your answers?
a. SAR
b. ABC
c. PRE
d. POS

8. In making a first impression, what should be your main focus?
a. Introduction
b. Dress Code
c. Making sure cell phone is off
d. Firm Handshake

9. What is an example of an illegal question in an interview?
a. How old are you?
b. Where were you born?
c. What was your GPA?
d. What is your native tongue?

10. What should you do when you haven't heard any response after an interview?
a. Call daily
b. An e-mail
c. Go back to the person who is hiring's office
d. None of the above

Pre- and Post-Test Questions will be delivered using surveymonkey.com.


Appendix B: User Questionnaire

User Questionnaire will be delivered using surveymonkey.com using the format displayed.


Appendix C: Focus Group Questions

Focus Group Script

We want to thank you for taking the time to participate in our focus group on the negotiating salaries video from Lynda.com. I’m Brett Copeland and this is Josh Miller. We’re both students in a course on evaluating e-learning. I will be moderating today, and Josh will be taking some notes. It’s important to remember that your participation today is strictly voluntary; you can withdraw at any time, for any reason.

Your thoughts are important to us. We want to know what you really think, so please feel free to tell us anything you are thinking.

We will be asking you some questions today about your experience with the Lynda.com video on job interview skills. The questions relate to your experiences, so there are no right or wrong answers. Again, you can withdraw from the focus group at any time, for any reason.

QUESTIONS

1. What is your opinion of the instructional video you viewed?
2. What experience do you have with job interviews?
3. How did viewing this video affect your confidence about interviewing for a job?
4. What distractions did you encounter when viewing this video?