QEP Assessment Report 2016-17

1. Introduction

Academic year 2016-17 was the fourth full year of QEP implementation. This report provides descriptions of formative and summative assessments (direct and indirect) undertaken within each of the three main QEP initiative areas: Curriculum Enhancement, Student Support, and Faculty Support. Additionally, reports from the first round of post-implementation, direct assessment of writing samples from writing-intensive courses are provided, along with comparisons, where possible, to reports from baseline direct assessment of writing samples from writing-intensive courses.

Within each section of the report, actions taken as a result of previous QEP assessments are reviewed (where applicable), and their impact, as determined through this year's assessments, is discussed. Also noted within each section are areas for improvement, as revealed through 2016-17 assessments, and actions planned in response. The report concludes with an overview of assessments planned for the 2017-18 year, the final year of the QEP.

2. Direct Assessment of Student Writing

2.1 Results and Discussion of Post-QEP Assessment of Writing Samples from WI Courses

Summer 2017 marked the beginning of post-implementation WI assessment for the QEP. In consultation with IPAR, student-writing projects were sampled from University Writing Portfolios in the following programs:

1. Addiction and Rehabilitation Studies
2. Communication Sciences and Disorders
3. Computer Science
4. Construction Management
5. Economics
6. Elementary and Middle Grades Education
7. Engineering
8. English
9. Literacy Studies, English Education, and History Education
10. Health Education and Promotion
11. Health Services and Information Management
12. History
13. Hospitality Leadership
14. Human Development and Family Science
15. Industrial Technology
16. Kinesiology
17. Management
18. Nutrition
19. Philosophy and Religious Studies
20. Political Science
21. Psychology
22. Recreation and Leisure Studies
23. Spanish
24. Special Education, Foundations, and Research

As detailed below, the results of these post-implementation assessments, when compared to results of baseline assessments of writing samples collected before the QEP began, suggest that QEP initiatives have had an overall positive, if uneven, impact on students' writing across diverse programs and courses. These first post-implementation results also confirm a suspicion noted in previous (baseline-focused) QEP assessment reports: that initial goals for determining the success of the QEP were overly ambitious. In the QEP, we stated the post-implementation goal as "80 percent of scores on a four-point scale will be a 3 or 4 and no more than 5 percent will be at 1." While not terribly far from the goal for the percent of scores at 1 (post-implementation scores of 1 are at or below 10% in all but 3 instances, with 24 instances at 0%), results for scores at 3 or 4 are notably below the goal articulated in the QEP. Of course, myriad variables beyond the QEP initiatives come into play in students' performance on writing tasks over time: curricula, faculty, and student populations have changed over the 5-year period of the QEP. Thus, while the goal of 80% of scores ≥ 3 has not been met, the upward trends in many of the SLO areas, particularly when considered in conjunction with the other positive impacts detailed in the Student and Faculty Support sections of this report, speak to the significant, positive impact of the QEP.

2.1.1 Assessment Process

During summer 2017, writing samples from the programs listed above were assessed using a rubric derived from the QEP SLOs. Those SLOs are as follows:

At the conclusion of their undergraduate degree programs, ECU graduates will be able to

SLO 1. Use writing to investigate complex, relevant topics and address significant questions through engagement with and effective use of credible sources.

SLO 2. Produce writing that reflects an awareness of context, purpose, and audience, particularly within the written genres (including genres that integrate writing with visuals, audio or other multimodal components) of their major disciplines and/or career fields.

SLO 3. Demonstrate that they understand writing as a process that can be made more effective through drafting and revision.

SLO 4. Proofread and edit their own writing, avoiding grammatical and mechanical errors.

SLO 5. Assess and explain the major choices that they make in their writing.

In the process of developing and applying the rubric during the first baseline assessments in the summer of 2012, it became clear that SLO 1 needed to be broken into two separate parts for meaningful scoring. As a result, the rubric includes SLO 1a, which focuses on students' abilities to use writing to investigate complex, relevant topics and address significant questions, and SLO 1b, which focuses on students' abilities to engage with and effectively use credible sources in their writing. A copy of the rubric, along with the questions provided to assessors to help them more consistently apply the criteria in the rubric, is available in Appendix A.

Prior to actual scoring, norming was conducted by the QEP Director and the Director of the University Writing Program. Following norming, each writing sample in the area was read and scored by two readers, one of whom was a faculty member in the program or college from which the samples were drawn. In the case of substantial disagreement (more than one point) in scores in any of the six rubric categories, a third reader reviewed the sample. Re-norming was conducted if, during this process, assessors consistently diverged significantly in one of the score areas.
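The double-scoring and adjudication rule described above can be expressed compactly. The following Python sketch is purely illustrative (the data layout and function name are hypothetical, not the tooling actually used in the assessment); it flags a sample for a third reader when two readers' scores differ by more than one point in any of the six rubric categories.

```python
# Illustrative sketch of the third-reader adjudication rule; the data
# layout and function name are hypothetical, not the actual scoring tool.

RUBRIC_CATEGORIES = ["1a", "1b", "2", "3", "4", "5"]  # six rubric areas

def needs_third_reader(reader1: dict, reader2: dict) -> bool:
    """True if the two readers' 1-4 scores differ by more than one
    point in any rubric category (the report's adjudication trigger)."""
    return any(abs(reader1[c] - reader2[c]) > 1 for c in RUBRIC_CATEGORIES)

# Example: the readers differ by two points on SLO 3, so this sample
# would be routed to a third reader.
r1 = {"1a": 3, "1b": 3, "2": 2, "3": 4, "4": 3, "5": 2}
r2 = {"1a": 3, "1b": 2, "2": 2, "3": 2, "4": 3, "5": 2}
print(needs_third_reader(r1, r2))  # True
```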

2.1.2 Limitations

Some limitations of the assessment process should be kept in mind when reviewing the results:

• Because scores for SLO 3 and SLO 5 were based on students’ responses to the “Writing Self-analysis Questions,” these scores reflect “indirect” assessment: in other words, scores were based on what students said about their writing processes and their revision choices rather than on a direct observation of their writing processes and revision choices. It is possible that students have, as part of their coursework and assignments, written multiple drafts, revised their writing significantly, and/or made informed and accurate assessments of the strengths and weaknesses of their own writing, but their abilities in these areas are not reflected in the assessment results because they have not responded fully or effectively to the self-analysis questions.

• The number of samples assessed varied across colleges and programs. Ideally, the number of writing samples included from each program would be proportional to the number of graduates from that program; however, this was not possible for several reasons. First, smaller programs and programs with only one or two WI courses could not provide as many writing samples as programs with multiple WI courses. Second, samples submitted varied considerably in length. Finally, assessments had to be completed for multiple programs in the limited time frame of a summer session.  

• College-level reorganizations, specifically the redistribution of programs in the former College of Human Ecology, have complicated pre- and post-QEP comparisons. The impacts of those reorganizations on reporting-by-college are addressed below.

2.1.3 Post-implementation Results by College

In keeping with previous direct assessment reporting, results are reported by college. Results tables include the percentage of scores at three or higher and the percentage at one because the criteria for success for post-QEP implementation, as articulated in the QEP document, stipulate that "80 percent of scores on a four-point scale will be a 3 or 4 and no more than 5 percent will be at 1."
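To make these success criteria concrete, the following Python sketch (an illustration with made-up scores, not the office's actual tooling) computes the summary statistics reported in the tables below and checks them against the QEP criteria.

```python
# Illustrative computation of the statistics reported in the results
# tables (mean, standard deviation, percent >= 3, percent at 1) and a
# check against the QEP success criteria. Example scores are made up.
from statistics import mean, pstdev

def summarize(scores):
    """Summary statistics for one SLO's 1-4 rubric scores."""
    n = len(scores)
    return {
        "mean": round(mean(scores), 2),
        "sd": round(pstdev(scores), 2),  # population SD assumed
        "pct_ge_3": 100 * sum(s >= 3 for s in scores) / n,
        "pct_at_1": 100 * sum(s == 1 for s in scores) / n,
    }

def meets_qep_criteria(summary):
    """QEP goal: at least 80% of scores at 3 or 4, no more than 5% at 1."""
    return summary["pct_ge_3"] >= 80 and summary["pct_at_1"] <= 5

example_scores = [3, 2, 4, 3, 1, 3, 2, 4, 3, 3]
s = summarize(example_scores)
print(s, meets_qep_criteria(s))
```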

Individual programs will receive separate reports of their results and will have the option to meet with the Director of the University Writing Program and/or the QEP Director to discuss results and to review initiatives currently in place to help raise scores.

Thomas Harriot College of Arts and Sciences (partial)

The sample included 158 writing samples randomly selected from 2000-, 3000-, and 4000-level WI courses in the following programs within HCAS: ENGL, ECON, HIST, PHIL/RELI, POLI, and PSYC.

| SLO | Mean | Standard Deviation | Percent ≥ 3 | Percent @ 1 |
|---|---|---|---|---|
| SLO 1a | 2.63 | 0.58 | 43% | 1% |
| SLO 1b | 2.61 | 0.66 | 44% | 2% |
| SLO 2 | 2.51 | 0.66 | 37% | 1% |
| SLO 3 | 2.27 | 0.83 | 30% | 13% |
| SLO 4 | 2.55 | 0.75 | 43% | 4% |
| SLO 5 | 2.40 | 0.80 | 35% | 10% |

Table 1: Harriot College Post-Implementation Results (Partial)

A comparison with baseline results from Harriot College is not included here because samples from several programs are not scheduled for assessment until summer 2018.  

College of Engineering and Technology

The sample included 62 writing samples randomly selected from 3000- and 4000-level WI courses in the College of Engineering and Technology.

| SLO | Mean | Standard Deviation | Percent ≥ 3 | Percent @ 1 |
|---|---|---|---|---|
| SLO 1a | 2.73 | 0.57 | 50% | 0% |
| SLO 1b | 2.35 | 0.61 | 29% | 4% |
| SLO 2 | 2.65 | 0.53 | 45% | 2% |
| SLO 3 | 2.39 | 0.62 | 20% | 2% |
| SLO 4 | 2.45 | 0.63 | 32% | 6% |
| SLO 5 | 2.27 | 0.61 | 20% | 10% |

Table 2: College of Engineering and Technology Post-Implementation Results

Comparison of Baseline and Post-implementation Assessment Scores

Baseline n=69; post-implementation n=62. Significant differences in means were determined by a t-test.

| SLO | Baseline Mean | Post Mean | Baseline Percent ≥ 3 | Post Percent ≥ 3 | Baseline Percent @ 1 | Post Percent @ 1 |
|---|---|---|---|---|---|---|
| SLO 1a | 2.23 | 2.73 | 20% | 50% | 10% | 0% |
| SLO 1b | 1.90 | 2.35 | 8% | 29% | 20% | 4% |
| SLO 2 | 2.26 | 2.65 | 19% | 45% | 6% | 2% |
| SLO 3 | 1.78 | 2.39 | 11% | 20% | 27% | 2% |
| SLO 4 | 2.33 | 2.45 | 29% | 32% | 4% | 6% |
| SLO 5 | 2.00 | 2.27 | 10% | 20% | 8% | 10% |

Table 3: College of Engineering and Technology Baseline and Post-Implementation Results
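The report does not specify which t-test variant was used to flag significant baseline-to-post differences. A minimal sketch of one plausible approach follows, using an independent-samples t-test at α = 0.05 via SciPy; both the test variant and the threshold are assumptions, and the score lists are placeholders.

```python
# Minimal sketch of a baseline vs. post-implementation comparison for one
# SLO. The independent-samples t-test and alpha = 0.05 are assumptions;
# the report states only that "a t-test" determined significance.
from scipy.stats import ttest_ind

baseline = [2, 1, 2, 2, 3, 1, 2, 2]  # placeholder scores (actual n = 69)
post = [3, 2, 2, 3, 2, 3, 2, 2, 3]   # placeholder scores (actual n = 62)

t_stat, p_value = ttest_ind(baseline, post)
verdict = "significant" if p_value < 0.05 else "not significant"
print(f"t = {t_stat:.2f}, p = {p_value:.3f} ({verdict} at alpha = 0.05)")
```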

College of Allied Health Sciences

The sample included 38 writing samples randomly selected from 3000- and 4000-level WI courses in the College of Allied Health Sciences.

No samples from Clinical Laboratory Sciences (CLSC) were included in this assessment because there were no CLSC samples available to include in the baseline assessment. Samples gathered from CLSC post-implementation (over academic years 2016-17 and 2017-18) will be assessed separately, in summer 2018, with results reported to the CLSC program.

The results reported below also do not include samples from NUTR because NUTR was not part of the College of Allied Health Sciences at the time that the baseline assessments were completed. NUTR samples were, however, assessed in the summer of 2017, and the program will receive a report of their results separately.

Additionally, fewer post-implementation samples were available from ADRE courses than desired due to the number of students who completed ENGL 1200 rather than ENGL 2201. Additional samples from ADRE courses will be pulled and scored during the summer 2018 assessment and the baseline/post-implementation comparison for the College of Allied Health Sciences will be updated with those additional scores. ADRE will receive their individual program report after these additional samples have been scored.

| SLO | Mean | Standard Deviation | Percent ≥ 3 | Percent @ 1 |
|---|---|---|---|---|
| SLO 1a | 2.82 | 0.78 | 53% | 0% |
| SLO 1b | 2.82 | 0.74 | 50% | 0% |
| SLO 2 | 2.59 | 0.88 | 39% | 8% |
| SLO 3 | 2.54 | 0.82 | 41% | 3% |
| SLO 4 | 2.84 | 0.70 | 58% | 3% |
| SLO 5 | 2.71 | 0.70 | 58% | 3% |

Table 4: College of Allied Health Sciences Post-Implementation Results

Comparison of Baseline and Post-implementation Assessment Scores

Baseline n=41; post-implementation n=38. Significant differences in means were determined by a t-test.

| SLO | Baseline Mean | Post Mean | Baseline Percent ≥ 3 | Post Percent ≥ 3 | Baseline Percent @ 1 | Post Percent @ 1 |
|---|---|---|---|---|---|---|
| SLO 1a | 2.95 | 2.82 | 66% | 53% | 0% | 0% |
| SLO 1b* | 3.05 | 2.82 | 80% | 50% | 0% | 0% |
| SLO 2 | 2.80 | 2.59 | 51% | 39% | 0% | 8% |
| SLO 3 | 2.14 | 2.54 | 29% | 41% | 25% | 3% |
| SLO 4 | 2.82 | 2.84 | 52% | 58% | 2% | 3% |
| SLO 5 | 1.64 | 2.71 | 8% | 58% | 32% | 3% |

Table 5: College of Allied Health Sciences Baseline and Post-Implementation Results

*In considering the baseline results for SLO 1b, it should be noted that 21 of the samples did not require the use of outside/secondary sources. Because SLO 1b addresses students' ability to use credible sources in an effective manner, the mean and percentages reported above for baseline assessment are based on scores for only 20 samples and are thus difficult to interpret.

College of Health and Human Performance

The sample included 92 writing samples randomly selected from 3000- and 4000-level WI courses in the following programs within CHHP: HDFS, HLTH, KINE, and RCLS.

Samples from SOCW and IDSN/MRCH will be assessed in the summer of 2018 and a report for all programs in the college, along with a comparison that omits data from programs that were in the now defunct College of Human Ecology at the time of baseline assessments, will be provided in the 2017-18 QEP Assessment Report.

| SLO | Mean | Standard Deviation | Percent ≥ 3 | Percent @ 1 |
|---|---|---|---|---|
| SLO 1a | 2.89 | 0.48 | 66% | 0% |
| SLO 1b* | 2.41 | 0.76 | 33% | 7% |
| SLO 2 | 2.59 | 0.57 | 38% | 0% |
| SLO 3 | 2.43 | 0.70 | 30% | 8% |
| SLO 4 | 2.52 | 0.73 | 40% | 8% |
| SLO 5 | 2.65 | 0.59 | 45% | 1% |

Table 6: College of Health and Human Performance Post-Implementation Results

*23 of the samples did not require the use of outside sources. Because SLO 1b addresses students' ability to use credible sources in an effective manner, the mean and percentages reported above for post-QEP assessment are based on scores for 69, rather than 92, samples.

College of Education

The sample included 73 writing samples selected from 3000- and 4000-level WI courses in the College of Education.

| SLO | Mean | Standard Deviation | Percent ≥ 3 | Percent @ 1 |
|---|---|---|---|---|
| SLO 1a | 2.84 | 0.66 | 55% | 0% |
| SLO 1b | 2.74 | 0.66 | 30% | 0% |
| SLO 2 | 2.68 | 0.62 | 43% | 0% |
| SLO 3 | 2.21 | 0.69 | 22% | 11% |
| SLO 4 | 2.52 | 0.85 | 36% | 1% |
| SLO 5 | 2.24 | 0.72 | 19% | 7% |

Table 7: College of Education Post-Implementation Results

Comparison of Baseline and Post-implementation Assessment Scores

Baseline n=59; post-implementation n=73.

| SLO | Baseline Mean | Post Mean | Baseline Percent ≥ 3 | Post Percent ≥ 3 | Baseline Percent @ 1 | Post Percent @ 1 |
|---|---|---|---|---|---|---|
| SLO 1a | 2.58 | 2.84 | 44% | 55% | 2% | 0% |
| SLO 1b* | 2.10 | 2.74 | 12% | 30% | 6% | 0% |
| SLO 2 | 2.50 | 2.68 | 38% | 43% | 0% | 0% |
| SLO 3 | 2.02 | 2.21 | 18% | 22% | 19% | 11% |
| SLO 4 | 2.75 | 2.52 | 49% | 36% | 2% | 1% |
| SLO 5 | 2.34 | 2.24 | 37% | 19% | 8% | 7% |

Table 8: College of Education Baseline and Post-Implementation Results

*In considering the post-implementation results, it should be noted that 29 of the 73 assessed samples did not require the use of outside sources. Because SLO 1b addresses students' ability to use credible sources in an effective manner, the mean and percentages reported above for post-QEP assessment are based on scores for only 44 samples.

College of Business

The sample included 20 writing samples selected from 4000-level WI courses in the College of Business.

The results reported below do not include samples from HOSP because HOSP was not part of the College of Business at the time that the baseline assessments were completed. HOSP samples were, however, assessed in the summer of 2017, and the program will receive a report of their results separately.

| SLO | Mean | Standard Deviation | Percent ≥ 3 | Percent @ 1 |
|---|---|---|---|---|
| SLO 1a | 2.63 | 0.52 | 40% | 0% |
| SLO 1b | 2.17 | 0.68 | 15% | 15% |
| SLO 2 | 2.58 | 0.58 | 35% | 0% |
| SLO 3 | 2.13 | 0.69 | 21% | 16% |
| SLO 4 | 2.48 | 0.49 | 30% | 0% |
| SLO 5 | 2.42 | 0.59 | 32% | 0% |

Table 9: College of Business Post-Implementation Results

Comparison of Baseline and Post-implementation Assessment Scores

Baseline n=36; post-implementation n=20*.

*The lower n for post-implementation assessment reflects the fact that more of the samples were lengthy, semester-long writing projects. As a result, fewer samples could be read and scored in the limited time frame of the assessment. Additional samples will be selected and assessed this summer to increase the post-implementation n. No statistically significant differences (as determined by a t-test) were observed in means between baseline and post-implementation results.

| SLO | Baseline Mean | Post Mean | Baseline Percent ≥ 3 | Post Percent ≥ 3 | Baseline Percent @ 1 | Post Percent @ 1 |
|---|---|---|---|---|---|---|
| SLO 1a | 2.94 | 2.63 | 61% | 40% | 0% | 0% |
| SLO 1b** | 2.23 | 2.17 | 27% | 15% | 0% | 15% |
| SLO 2 | 2.38 | 2.58 | 22% | 35% | 3% | 0% |
| SLO 3*** | NA | 2.13 | NA | 21% | NA | 16% |
| SLO 4 | 2.63 | 2.48 | 42% | 30% | 3% | 0% |
| SLO 5*** | NA | 2.42 | NA | 32% | NA | 0% |

Table 10: College of Business Baseline and Post-Implementation Results

**In considering the baseline results for SLO 1b, it should be noted that 25 of the 36 samples did not require the use of secondary/outside sources. Because SLO 1b addresses students' ability to use credible sources in an effective manner, the mean and percentages reported above for baseline assessment are based on scores for only 11 samples and are thus difficult to interpret.

***Because only 5 of the baseline samples included a writing self-analysis, comparison data is not included for SLO 3 and SLO 5.

2.1.4 Summary and Discussion of Assessment Data

Among the 4 colleges for which baseline and post-implementation comparisons were possible, changes between baseline and post-implementation mean scores in the SLOs can be summarized as follows:

| SLO | # of colleges with significant increase in mean score | # of colleges with increase in mean score (not significant) | # of colleges with significant decrease in mean score | # of colleges with decrease in mean score (not significant) |
|---|---|---|---|---|
| SLO 1a | 2 | 0 | 0 | 2 |
| SLO 1b | 1 | 1 | 0 | 2 |
| SLO 2 | 1 | 2 | 0 | 1 |
| SLO 3* | 1 | 2 | 0 | 0 |
| SLO 4 | 0 | 2 | 1 | 1 |
| SLO 5* | 2 | 0 | 0 | 1 |
| TOTALS | 7 | 7 | 1 | 7 |

Table 11: Changes in Mean Scores across Colleges Assessed in Summer 2017

*One college did not have sufficient baseline data for SLOs 3 and 5; thus, comparison was not possible for these two SLOs for that college.

These initial post-implementation and baseline comparisons suggest that QEP initiatives have had an overall positive, if uneven, impact across diverse programs and courses, with 7 instances of significant increases in mean scores in SLO areas, 1 instance of a statistically significant decrease, and 14 instances in which differences, some increases and some decreases, were not statistically significant.

These first post-implementation results also confirm a suspicion noted in previous (baseline-focused) QEP assessment reports: that initial goals for determining the success of the QEP were overly ambitious. In the QEP, we stated the post-implementation goal as "80 percent of scores on a four-point scale will be a 3 or 4 and no more than 5 percent will be at 1." As the tables below indicate, we are not terribly far from the goal for the percent of scores at 1; however, results for scores at 3 or 4 are notably below the goal articulated in the QEP.

| SLO | Colleges with more than 10% of scores @1 | Colleges with 5-10% of scores @1 | Colleges with less than 5% of scores @1 |
|---|---|---|---|
| SLO 1a | 0 | 0 | 6 |
| SLO 1b | 1 | 1 | 4 |
| SLO 2 | 0 | 1 | 5 |
| SLO 3* | 3 | 1 | 2 |
| SLO 4 | 0 | 2 | 4 |
| SLO 5* | 0 | 3 | 3 |
| TOTALS | 4 | 8 | 24 |

Table 12: Post-implementation Percent of Scores @1 in 6 Colleges Assessed in Summer 2017

| SLO | Colleges with less than 20% of scores ≥ 3 | Colleges with 20-39% of scores ≥ 3 | Colleges with 40-60% of scores ≥ 3 | Colleges with more than 60% of scores ≥ 3 |
|---|---|---|---|---|
| SLO 1a | 0 | 0 | 5 | 1 |
| SLO 1b | 1 | 3 | 2 | 0 |
| SLO 2 | 0 | 4 | 2 | 0 |
| SLO 3* | 0 | 5 | 1 | 0 |
| SLO 4 | 0 | 3 | 3 | 0 |
| SLO 5* | 1 | 3 | 2 | 0 |
| TOTALS | 2 | 18 | 15 | 1 |

Table 13: Post-implementation Percent of Scores ≥ 3 in the 6 Colleges Assessed in Summer 2017

In interpreting this assessment data, it is important to remember that many key variables beyond the QEP initiatives come into play in students' performance on writing tasks over time: curricula, faculty, and student populations have changed over the 5-year period of the QEP. Thus, while the initial goal of 80% of scores ≥ 3 has not been met, the significant mean increases in several of the SLO areas, particularly when considered in conjunction with the other positive impacts detailed in this and earlier QEP assessment reports, speak to the positive impact of the QEP.

Initial post-implementation assessments also reveal that the area most in need of additional instructional support is SLO 3: Students will demonstrate that they understand writing as a process that can be made more effective through drafting and revision. Percentages of students scoring @1 were highest in this SLO even after the implementation of the QEP. Actions planned for the 2017-18 academic year to address this particular outcome include the following:

• A Writing and Learning Community (WLC) for the 2017-18 academic year is focusing on "Teaching Writing as a Process." This group will, following the model of earlier WLCs in the QEP, spend the academic year investigating strategies for helping students to recognize and more fully engage with writing as a recursive, revision-heavy practice. The WLC will present materials developed in spring 2018. Subsequently, these materials will be made available on the Writing@ECU website and promoted to faculty and students.

• Liaisons will be encouraged to direct their colleagues and students to the many resources related to writing and revision processes that are already available on the Writing@ECU website.

• As program reviews continue through the WAC Committee of the Faculty Senate, we anticipate that faculty, and hence students, will become more attuned to drafting and revising. As part of the program review process, programs and their faculty will need to consider and articulate how their WI courses help move students toward the University Writing Outcomes (which are very similar to the QEP SLOs), including Outcome 3 about the writing process.

The University Writing Program is also currently working with ITCS to explore online platforms for peer review of writing, including platforms that integrate self-assessment of students' writing and drafting processes. This exploration will include stakeholders from across the university and will necessitate detailed consideration and piloting of multiple platform options. Thus, it will likely not be completed until AY 2018-19, but it is another step the university is taking to help students become more adept at drafting and revising.

SLO 1b, which addresses the latter half of SLO 1 (students will demonstrate "engagement with and effective use of credible sources"), will also be addressed in response to the fact that one of the colleges included in this summer's assessment had more than 10% of students scoring @1. Specifically, one of the Writing Liaisons meetings for this academic year will focus on methods and resources for improving students' use of source materials in their writing. The Liaisons will then be expected to take what they've learned back to the faculty in their programs. Additionally, concerted efforts will be made to promote the many existing resources available on this topic on both the Writing@ECU website and the Joyner Library website.

2.1.5 Assessment of the WI Assessment Process—Internal Assessors' Questionnaire

As in past summers, assessors were asked to respond to questions upon completion of the assessment process. Questionnaires were completed by 24 assessors, representing 20 different academic programs. Questions asked included the following:

1. What difference, if any, did you notice between how you read student writing when you are "grading" a piece of writing and when you are doing "assessment"?

2. What did you find interesting/useful in the process of assessing writing from your own and other disciplines? What surprised you, if anything?

3. What, if anything, did you notice about the writing from your own discipline, department, school, or college when compared to the samples you read from other groups of students?

4. What, if anything, did you learn from this experience that might impact your own teaching of writing in the future?

In keeping with previous WI assessment feedback, responses this year reveal how beneficial the process of participating in the assessment can be for faculty. Some themes emerged surrounding the benefits accrued through participation. These themes, along with several representative excerpts from the questionnaires, are enumerated below:

1. Gained Insight into Assignment and/or Curriculum Design—15/24 (62% of responses)

"It was surprising to see…lack of clarity in assignment instructions."

"I found the variation in assignment instructions even within [different sections] of the same course [useful/interesting]."

"[Faculty in the same area] seem to have different expectations in terms of length we expect students to write."

"[I learned] perhaps to design assignments that target specific aspects of writing that may be more meaningful for students beyond the specific class I am teaching."

"I would like a second opinion from the writing intensive faculty to assist in review/revising [our] assignment."

"I enjoyed reading and learning about what others do in their WI classes—some things could be applied in my discipline too."

"I liked to see samples from other courses in my discipline…. I will use the types of papers I have seen as examples."

"The first lesson was to provide an explicit instruction of writing task and the instructor's expectation. Without a guideline, students might have a hard time [putting] their ideas in the form of writing."

2. Recognized Need to Emphasize/Teach Writing as a Process—9/24 (37% of responses)

"I found this process very beneficial. It obligated me to think about the writing process and how we, as professors, are relaying information to students."

"[I learned of] the importance of requiring submission of different parts of the paper…. I do this in some classes, but see now just how important it is to the students."

"I learned that it is important to sit down with the students and explain the writing process/revision expectations."

"I think I may tweak my class to force more thought and engagement in process."

"I learned that I need to highlight the process of writing at the beginning of the class. I overlooked this task in the past."

3. Recognized Importance of Teaching Reflection/Self-analysis—5/24 (21% of responses)

"I plan to better prepare my WI students how to answer the self-analysis questions."

"I realized that most students don't really understand the self-analysis component…. I think I can help explain these questions better to hopefully have better writing reflections in the future."

"I found it useful to see 'the end' of uploads and think I can better prepare my learners and produce better future reflections."

"Analyzing [your] own writing must be discussed in the class, so the students understand what is expected from writing intensive courses and why."

Additionally, three respondents noted that they believe that all instructors (or at least all WI instructors) should have the assessment experience:

"I think every professor should experience scoring at least one time. Very helpful."

"I have learned a lot and I hope this process can be shared by all faculty as [a] refresher course in grading and effective teaching of the writing process"

"I think that all professors teaching WI courses should participate in the assessment process. It is really an eye-opening exercise to see these projects as an assessors and not just as the instructor."

It is also noteworthy that two of the programs participating in the summer assessment have already initiated curricular revisions (changing the courses on which the WI designation is applied and/or developing new courses to better ensure that the SLOs are being clearly addressed in the undergraduate curriculum) in response to what was learned in the assessment process.

3. Indirect Assessments of Student Writing: Post-Implementation

3.1 Results and Discussion of UNC/ECU Sophomore Survey—Spring 2017

Six writing-related questions, aligned with QEP SLOs, were included in the sophomore survey beginning in AY 2012-2013. Data from the spring 2013 (baseline) and spring 2017 (post-implementation) administrations are compared in the table below.

Please indicate the extent of your agreement or disagreement with the statements.

| Statement | Spring 2013 % "Strongly Agree" or "Agree" (n=1608) | Spring 2017 % "Strongly Agree" or "Agree" (n=244) |
|---|---|---|
| Writing about complicated or tricky topics and situations helps me to think about them | 62.7% | 54.1% |
| I am well prepared to write effectively in the styles and formats of my career field | 69.6% | 66.8% |
| When composing important documents, I often write multiple drafts | 58.6% | 52.1% |
| I regularly take time to proofread my writing before giving it to others | 75.7% | 76.2% |
| I am confident in my ability to avoid grammatical errors in my writing | 75.5% | 67.6% |
| I am confident in my ability to evaluate the quality of my own writing | 75.5% | 71.7% |

Table 14: Sophomore Survey Results for 2013 (Baseline) and 2017 (Post-implementation)

The significance of the differences between the 2013 and 2017 responses is difficult to assess due to the low response rate to the most recent sophomore survey. We hope for a better response rate in spring 2018.

3.2 Results and Discussion of UNC/ECU Graduating Senior Survey—Spring 2017

Six writing-related questions, aligned with QEP SLOs, were included in the graduating senior survey beginning in AY 2012-2013. Baseline data from the spring 2013 administration and post-implementation data from the spring 2017 administration are compared in the table below.

Please indicate the extent of your agreement or disagreement with the statements.

| Statement | 2013 % "Strongly Agree" or "Agree" (n=1587) | 2017 % "Strongly Agree" or "Agree" (n=1567) |
|---|---|---|
| Writing about complicated or tricky topics and situations helps me to think about them | 72.7% | 75.7% |
| I am well prepared to write effectively in the styles and formats of my career field | 86.2% | 86.1% |
| When composing important documents, I often write multiple drafts | 63.9% | 64.0% |
| I regularly take time to proofread my writing before giving it to others | 83.0% | 84.4% |
| I am confident in my ability to avoid grammatical errors in my writing | 80.7% | 81.5% |
| I am confident in my ability to evaluate the quality of my own writing | 84.2% | 85.1% |

Table 15: Senior Survey Results for 2013 (Baseline) and 2017 (Post-implementation)

There has been a significant increase in student responses of "agree" and "strongly agree" to the first statement, which is intended to align with QEP SLO 1a. Differences across responses to the other statements do not reflect significant changes, with increases of less than 1.5% for 3 of the remaining statements and decreases of less than 1% for the other 2.

It should be noted that many students represented in this graduating senior survey did not complete ENGL 2201 or its equivalent but had completed their Writing Foundations courses under the old ENGL 1100/1200 sequence. Of the approximately 1566 students who responded to the prompts related to the QEP outcomes (not all students responded to each prompt), only 118 had completed ENGL 2201. We anticipate that the number of respondents who have completed ENGL 2201 will be substantially higher in the next administration of the survey.

4. Curriculum Enhancement Initiatives Implementation Assessment

Several measures were taken during AY 2016-17 to continue ongoing assessment of the University Writing Portfolio and ENGL 2201 implementation processes.

4.1 Results and Discussion of University Writing Portfolio Implementation

4.1.1 WI Syllabi Review

Writing-intensive course syllabi were reviewed by two QEP graduate assistants for mention of the University Writing Portfolio and, further, to determine if the portfolio was required or strongly encouraged. Syllabi review revealed the following:

| WI Syllabus Portfolio Mention | Spring 2015 (n=179) | Fall 2015 (n=97) | Spring 2016 (n=124) | Fall 2016 (n=126) | Spring 2017 (n=123) |
|---|---|---|---|---|---|
| UWPort required for credit/grade/course completion | 24% | 56% | 64% | 77% | 79% |
| UWPort mentioned, but NOT REQUIRED (strongly encouraged, for extra credit, etc.) | 44% | 1% | 1% | 0% | 0% |
| No UWPort mention | 32% | 43% | 35% | 23% | 21% |

Table 16: WI Syllabus Review Results for UWPort Mention

The 15-percentage-point increase from spring 2016 (64%) to spring 2017 (79%) in syllabi that require the UWPort as part of the class is encouraging. Efforts will continue to ensure that all faculty, particularly newly hired faculty, are aware of the portfolio upload requirement for WI courses.

4.1.2 UWPort Upload Review

The QEP Director, with help from a QEP graduate assistant and the Web Coordinator, also reviewed University Writing Portfolio submission rates for fall 2016 and spring 2017 writing-intensive courses. Those rates, along with rates from previous semesters, are included below:

| WI Course Submission Rates | Fall 2014 | Fall 2015 | Spring 2016 | Fall 2016 | Spring 2017 |
|---|---|---|---|---|---|
| > ½ of students submitting | 41% | 53% | 46% | 66% | 75% |
| < ½ of students submitting | 31% | 28% | 36% | 20% | 14% |
| No students submitting | 28% | 19% | 18% | 15% | 11% |

Table 17: UWPort Upload Rates Fall 2014 through Spring 2017

Upload rates continue to rise, reflecting persistent efforts of Writing Liaisons, the WAC Committee of the Faculty Senate, Deans, Directors, and Department Chairs, all of whom have played an important role in spreading awareness of the upload process. Prior to the start of the Fall 2017 semester, in an effort to raise submission rates even higher, the Provost directly contacted Deans in colleges with the lowest submission rates, encouraging them to ensure that faculty teaching WI courses have students upload material.

4.2 Results and Discussion of Enrollment in Discipline-themed ENGL 2201 Sections

As noted in the 2015-2016 QEP Assessment Report, discipline-themed sections of ENGL 2201 often did not enroll students whose declared or intended majors aligned with the discipline-focus of the section. A number of efforts—including better coordination of scheduling and advising within different programs via the Writing Liaisons and the implementation of various major-related enrollment restrictions on the sections in Banner—were implemented during the 2016-17 academic year to address this problem. Enrollments in discipline-themed sections for fall 2017 reflect the positive impact of those efforts, as demonstrated in the table below. Note that majors that might fit into multiple disciplinary areas (for example, Psychology may be considered a social science or a health science, depending on what the student focuses on) were considered "in" the disciplinary theme of the section if that theme could be related to the major.

| Discipline Theme | Fall 2015 # of Sections | Fall 2015 % in Discipline | Fall 2017 # of Sections | Fall 2017 % in Discipline |
|---|---|---|---|---|
| Arts & Humanities | 3 | 22% | 3 | 97% |
| Business | 4 | 68% | 4 | 90% |
| Communication | 3 | 35% | 2 | 100% |
| Engineering & Technology | 2 | 61% | 5 | 100% |
| Education | 4 | 39% | 4 | 97% |
| Health Sciences | 3 | 91% | 4 | 97% |
| Natural Sciences | 2 | 62% | 4 | 100% |
| Social Sciences | 3 | 47% | 3 | 93% |

Table 18: Enrollment in Discipline-themed ENGL 2201 Sections

4.3 Results and Discussion of ENGL 2201 Instructor Survey

4.3.1 Student Engagement with ENGL 2201 Material and Major/Career Connections

At the conclusion of the spring 2017 semester, faculty teaching ENGL 2201 were asked to complete a brief survey about their perceptions of the effectiveness of the 2201 implementation process and the effectiveness of the course in helping students move into writing in their major/career areas. Results are presented below, with results from spring 2016 for comparison.

| Statement | Spring 2016 % Somewhat Agree, Agree, or Strongly Agree (n=20) | Spring 2017 % Somewhat Agree, Agree, or Strongly Agree (n=19) |
|---|---|---|
| The majority of students in my 2201 section(s) seemed engaged in the assignments | 60% | 79% |
| The majority of students in 2201 seemed to be able to make connections between what we did in the class and what they will do in their majors/career areas | 52% | 84% |

Table 19: ENGL 2201 Faculty Responses to Engagement and Disciplinary Connections Statement

The increased agreement reflected in these survey results suggests that the various actions discussed in the 2015-16 QEP Assessment Report—promoting the discipline-specific writing resources available via the Writing@ECU website and implementing practices to ensure more appropriate enrollment in discipline-themed sections of ENGL 2201—have had a positive impact on the course as perceived by instructors.

4.3.2 Contribution to QEP SLOs

Instructors were also asked how much they felt that the ENGL 2201 curriculum has helped their students in moving toward the QEP SLOs. The table below includes responses from the spring 2017 and spring 2016 semesters for comparison. Note that, similar to the QEP WI rubric, the survey broke SLO 1 into two parts.

| Learning Outcome | Spring 2016 % Responding "A lot" or "A Moderate Amount" (n=20) | Spring 2017 % Responding "A lot" or "A Moderate Amount" (n=18) |
|---|---|---|
| 1a. Using writing to investigate complex topics and address significant questions. | 70% | 89% |
| 1b. Locating and integrating credible research sources into their writing. | 95% | 89% |
| 2. Producing writing that effectively addresses contexts, purposes, and audiences. | 80% | 94% |
| 3. Using drafting and revision to improve their writing. | 70% | 72% |
| 4. Proofreading and editing to avoid grammatical and mechanical errors. | 60% | 56% |
| 5. Explaining and assessing the major choices that they make in their writing. | 70% | 83% |

Table 20: ENGL 2201 Faculty Responses to QEP SLO Statement

Responses suggest that proofreading and editing continue to be a challenge for many students and instructors. Two QEP/UWP Writing and Learning Communities focused on proofreading and editing and developed additional resources in the 2016-2017 academic year. Those resources, while shared at two professional development sessions in the spring 2017 semester and added to the Writing@ECU website this fall, would not have been deployed in time to affect responses to the survey for this past year. With promotion of these resources, particularly among instructors of ENGL 2201, we hope to see higher positive responses relating to SLO 4 in future years.

4.3.3 Comparison to ENGL 1200

Survey respondents who indicated that they had taught ENGL 1200 in the past were asked to compare the performance of students in ENGL 2201 to the performance of students in ENGL 1200. Results from spring 2016 and spring 2017 are compared in the table below.

| Performance Area | Spring 2016 % "Much Better" or "Somewhat Better" (n=10) | Spring 2017 % "Much Better" or "Somewhat Better" (n=12) |
|---|---|---|
| Engagement with course material | 40% | 58% |
| Effort exerted on assignments | 30% | 58% |
| Contributions to peer review | 10% | 42% |
| Participation in class activities | 10% | 33% |
| Understanding course readings | 20% | 25% |

Table 21: Instructor Perception of Student Performance in ENGL 1200 versus ENGL 2201

This year's increase in the percentage of instructors who felt that students performed better in ENGL 2201 than they did in ENGL 1200 may reflect that both students and faculty are more familiar with the curriculum. It may also be a result of the better alignment of students within discipline-themed sections. It will become increasingly difficult to get indirect assessment data about this comparison as fewer and fewer instructors will have had experience teaching ENGL 1200 following the curriculum shift, but we hope to see further increases, particularly in the area of "understanding course readings," in the final year of the QEP. To this end, the QEP Director will work with the Writing Foundations Director and the Writing Liaisons to develop lists of discipline-focused topics and readings that may be more accessible to sophomore-level students who are new to the field. Additionally, because students' struggles to understand course readings may require further attention to some of the same strategies that are needed to achieve SLO 1b—"engagement with and effective use of outside sources" (being able to summarize accurately or to identify how different concepts in readings relate to one another, for example)—promotion of the resources identified in section 2.1.4 above will be implemented among instructors and students in ENGL 2201 during the 2017-2018 academic year.

4.4 Results and Discussion of ENGL 2201 Student Survey

Students in ENGL 2201 were surveyed during the final two weeks of the spring 2017 semester. Faculty teaching ENGL 2201 were asked to encourage students to take the brief survey. Of the approximately 1450 students in 63 sections of ENGL 2201 offered in spring 2017, a total of 304 responded, for a response rate of 22%. Not all respondents answered all questions.

4.4.1 Connection to Major/Career Area

To better understand students' perceptions of the connections between the work done in 2201 and the writing they may encounter in future courses and in their careers, the survey asked respondents to indicate their level of agreement with three statements about these connections. Percentages of respondents indicating that they agree or strongly agree with the statements from the spring 2016 and spring 2017 surveys are included below:

| Question | Spring 2016 % Agree or Strongly Agree (n=156) | Spring 2017 % Agree or Strongly Agree (n=292) |
|---|---|---|
| ENGL 2201 has helped me better understand how writing works in my major/intended career. | 59% | 67% |
| The assignments we have done in ENGL 2201 will apply to my major area. | 45% | 57% |
| The things we have done in ENGL 2201 will apply to my career. | 36% | 53% |

Table 22: Student Perception of Connections between ENGL 2201 and Major/Career Area

In line with the increase in agreement among instructors with the statement "The majority of students in 2201 seemed to be able to make connections between what we did in the class and what they will do in their majors/career areas" (see 4.3.1 above), these results suggest that course revisions and accumulated teaching practice have improved the effectiveness and impact of ENGL 2201 for students over the QEP period.

5. Student Support Initiatives Assessment

5.1 Results and Discussion of Writing Mentors Program Surveys

Surveys to measure the impact of the Writing Mentors program were distributed in fall 2016 and spring 2017 to faculty who worked with Mentors in their WI courses, to the students in those courses, and to the Mentors themselves. As a reminder, the Mentor Program is designed such that only certain courses include work with a Writing Mentor: faculty must apply to work with a Mentor, and a properly qualified Mentor must be available at the time of the faculty member's class.

5.1.1 Faculty Survey

In fall of 2016, seven participating faculty members completed the survey. In spring 2017, four participating faculty completed the survey.

The survey provided a series of statements about the helpfulness of Mentors in different areas of writing and asked faculty to rate, on a scale of 1 (strongly disagree) to 5 (strongly agree), how much they agreed or disagreed with each statement. Percentages of respondents indicating "Agree" or "Strongly Agree" are included in the table below.

How much do you agree or disagree with the following statements? (1-5 scale, 1 = strongly disagree and 5 = strongly agree)

| The Writing Mentor in my class helped students to… | Fall 2016 % "Agree" or "Strongly Agree" (n=7) | Spring 2017 % "Agree" or "Strongly Agree" (n=4) |
|---|---|---|
| Understand writing assignments | 86% | 100% |
| Develop ideas for writing | 43% | 100% |
| Establish and maintain a thesis/focus | 86% | 100% |
| Find good outside sources (books, articles, web sites, etc.) for writing | 71% | 75% |
| Understand audience when writing | 57% | 100% |
| Write multiple drafts of assignments | 86% | 100% |
| Revise (make substantive changes to) writing | 86% | 100% |
| Edit/proofread writing | 86% | 100% |
| Recognize strengths and weaknesses in their own writing | 86% | 75% |

Table 23: Instructor Perceptions of Writing Mentor Impact

While the small number of faculty participants in spring 2017 makes interpretation of results difficult, the rise in percentages for several of the statements between fall 2016 and spring 2017 is encouraging. Agreement also rose in the area of "find good outside sources for writing," although the consistency with which this area receives the lowest percentage of agreement responses suggests that it is an area for continued focus in Mentor professional development.

The data from the closed-ended questions on the faculty surveys are instructive, but the details provided in open-ended responses reveal specific ideas for what works well in the program and what might be improved. Benefits of the Writing Mentors Program highlighted in faculty survey responses from fall 2016 and spring 2017 include the following:

• It provides a supportive, approachable peer for students: 7/11 respondents (64%) mentioned this benefit.

"Repeated contact with a mentor that is closer to a peer for the student, and a raised awareness of the importance of class writing assignments are the biggest benefits."

"[Students] seemed to enjoy [the mentor's] personality and his advice."

"Many students may fear going to the professor for help with writing. Having the mentor provides these students with an outlet for support."

"I also find that many students won't come to the instructor for help (no matter how 'open' the instructor's door remains); however, students often feel comfortable going to a fellow student for help and advice."

"I think it introduces a peer who is more approachable."

"A peer mentor who students feel comfortable with and from whom they might be more willing to accept advice."

• It provides additional feedback for students: 4/11 respondents (36%) mentioned this benefit.

"It gave the students an extra person to check in with about some of their writing concerns."

"The Writing Mentor provides an additional source for students to use when working on their writing."

"It also introduces another authority that provides a rich perspective."

"Having a writing mentor to provide targeted feedback and guidance - especially on assignment details and things like citation resources really helps me as an instructor focus on bigger picture concerns with student writing. It's fantastic!"

• It increases students' engagement with the writing process: 3/11 respondents (27%) mentioned this benefit.

"Undergrads [in this program] are absolutely swamped with other challenging tasks…, so writing tends to get left out; this program keeps it higher on their list."

"It allows me to require more by way of drafts and revisions for major writing assignments. The students get practice doing this."

"[It] encourages editing and revising."

Suggestions for improvement from the surveys in fall 2016 and spring 2017 were somewhat limited, with several respondents suggesting that there are no ways to improve the program. One area was, however, offered for improvement:

• Expand the Program/Provide More Discipline-specific Mentors: 5/11 respondents (45%) made this kind of suggestion.

"I would say the biggest issue is that the resource is not available every semester for every class. I realize this is a financial and human resource issue - but one can dream!"

"Get more writing mentors (somehow)."

"Find discipline-specific mentors"

"If I could suggest any improvement to the program, I would suggest mentors have at least taken the course for which they will serve as a mentor. If this isn't an option, at least have the mentor be part of the major. Being part of the major at least provides them with an understanding of content and context that can be applied to their understanding of the assignment and their mentoring sessions with students. I realize this isn't always a feasible option, but it helped greatly in my situation."

"Mentor experience as a mentor is important, but even more so, experience in the discipline they are working with."

5.1.2 Student Survey Results

In fall 2016, a total of 49 students responded to the survey distributed to classes with a Writing Mentor assigned (a response rate of approximately 30%). In spring 2017, a total of 39 students responded (a response rate of approximately 31%).

Students were asked how much they felt the Mentor(s) working with their classes had helped them with various writing tasks by rating their agreement with a set of statements aligned with the QEP SLOs. Percentages of respondents answering "Strongly Agree" or "Agree" in fall 2016 and spring 2017 appear in the table below.


How much do you agree or disagree with the following statements (1-5 scale, 1=strongly disagree and 5=strongly agree)? The Writing Mentor helped me to…

                                                          Fall 2016           Spring 2017
                                                          % "Agree" or        % "Agree" or
                                                          "Strongly Agree"    "Strongly Agree"
                                                          (n=49)              (n=39)
Understand writing assignments                            68%                 90%
Develop ideas for writing                                 80%                 90%
Establish and maintain a thesis/focus for my writing      71%                 87%
Find good outside sources (books, articles, web sites,
  etc.) for my writing                                    53%                 64%
Understand my audience when writing                       65%                 85%
Write multiple drafts of my assignments                   70%                 82%
Revise (make substantive changes to) my writing           80%                 92%
Edit/Proofread my writing                                 74%                 92%
Recognize strengths and weaknesses in my own writing      67%                 87%
Address weaknesses in my own writing                      65%                 87%

Table 24: Student Perceptions of Writing Mentor Impact

Although agreement rates were lower in fall 2016 than in spring 2017, students in both semesters reported that Writing Mentors helped them. "Finding good outside sources" remains among the lowest-rated areas for both students and faculty. This may be because faculty are not directing students to work with the Mentor on this part of the writing process, because the Mentors are not as adept at this part of the process as they are at the others, or some combination of the two. As noted above, greater efforts will be made this academic year (2017-2018) to publicize resources available through both the Writing@ECU website and the Joyner Library website.
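To make the semester-to-semester shift concrete, the following minimal, illustrative sketch (in Python, with percentages transcribed from Table 24 above; the script itself is not part of the survey instrument) computes the fall-to-spring change in agreement for each item:

    # Illustrative only: percent of students answering "Agree" or "Strongly
    # Agree", transcribed from Table 24 above as (fall 2016, spring 2017).
    agreement = {
        "Understand writing assignments":       (68, 90),
        "Develop ideas for writing":            (80, 90),
        "Establish/maintain a thesis or focus": (71, 87),
        "Find good outside sources":            (53, 64),
        "Understand my audience":               (65, 85),
        "Write multiple drafts":                (70, 82),
        "Revise (substantive changes)":         (80, 92),
        "Edit/proofread":                       (74, 92),
        "Recognize strengths and weaknesses":   (67, 87),
        "Address weaknesses":                   (65, 87),
    }

    # Per-item change, in percentage points.
    for item, (fall, spring) in agreement.items():
        print(f"{item}: {spring - fall:+d} percentage points")

    # Mean change across all ten items (+16.3 points for these values).
    changes = [spring - fall for fall, spring in agreement.values()]
    print(f"Mean change: {sum(changes) / len(changes):+.1f} points")

All ten items rise from fall 2016 to spring 2017, consistent with the interpretation above.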

As was the case with the faculty survey, students' open-ended comments provide useful information about the benefits and areas for improvement in the Writing Mentors Program. Many benefits were mentioned, and many responses were general praise (e.g., "The mentor provided good feedback"), but two themes emerged across multiple open-ended responses:

• The Mentor Program provides students with a supportive and approachable peer (as opposed to professor): 14/79 respondents (18%) mentioned this benefit.

"I like the fact that I had someone to go to other than my professor to talk to about my paper."

"Having another person available to ask questions besides the instructor creates a level of comfort for many students, especially when the writing mentor is a student. They are more likely to ask questions and express feelings that they would not feel comfortable asking a professor."

"I think the best benefit is having someone there that you can rely on that isn't your professor to help you through the writing process. Sometimes people are afraid to ask their professor for help, so this is the best option for them."

"It helped me get the perspective of someone else without the pressure of it being my professor."

• A Mentor familiar with the course/discipline helps to clarify assignments and expectations for students: 10/79 respondents (13%) mentioned this benefit.

"[The Mentor] has taken this class before so she knew what was expected from us which really helped."


"The biggest benefit of the writing mentor program is some have already taken a class that a student is currently in and can relate to the assignment and make it easier for them to understand."

"It was nice to have to opportunity to talk with a student who has previously taken this course because they know exactly what the teacher is looking for in each assignment."

"It's good to have someone who has already taken the class and had experience with the teacher to give you advice on assignments."

Additionally, four respondents indicated that working with a Writing Mentor resulted in increased confidence in their writing. Bolstering confidence was a theme mentioned in previous years' student survey responses about the Writing Mentor Program as well.

One recommendation for improvement appeared in multiple responses:

• Improve scheduling and increase options for times to meet with Mentor: 21/42 respondents (50%) made this suggestion.

As has been the case throughout the implementation of the Writing Mentor Program, students report that Writing Mentors are valuable additions to their writing-intensive courses. Scheduling times for Mentors and students to meet is the one area in which difficulties persist. Given the busy and divergent schedules kept by students and Mentors, this issue is difficult to resolve without a substantial expansion of the program to add Mentors and, with them, more availability to meet with students. Such an expansion is something the University Writing Center hopes to undertake in future years, perhaps with the assistance of outside donors. Another approach to adding scheduling flexibility is being piloted in fall 2017: a "consultant-liaison" model in which a Writing Mentor is paired with a faculty member teaching a WI course. Rather than working directly with each student in the class, the Mentor lets students follow the standard procedure for making appointments in the Writing Center; students schedule appointments with any consultant available at times that work with their schedules. The Mentor serves as a go-between: letting the instructor know what writing issues consultants are seeing in the work of students from the class; supporting the instructor by suggesting revisions to assignments, activities, rubrics, and the like; and developing handouts and other supplemental materials that target the areas in which students are struggling. At the same time, the Mentor shares details about the course, the assignments, and the instructor's expectations and concerns related to students' writing with the UWC consultants so that those consultants are better prepared to help students from the class. If this approach proves useful, the pilot may be expanded in the spring 2018 semester.

5.1.3 Mentor Survey Results

In fall 2016, 7 Writing Mentors responded to the survey about their experiences. In spring 2017, 5 Mentors responded.

Mentors were asked how much of an impact they felt they had on student writers by indicating agreement with a series of statements. Results are below.


How much do you agree or disagree with the following statements (1-5 scale, 1=strongly disagree and 5=strongly agree)? You were able to help the students to…

                                                          Fall 2016           Spring 2017
                                                          % "Agree" or        % "Agree" or
                                                          "Strongly Agree"    "Strongly Agree"
                                                          (n=7)               (n=5)
Understand writing assignments                            100%                100%
Develop ideas for writing                                 100%                100%
Establish and maintain a thesis/focus for their writing   100%                100%
Find good outside sources (books, articles, web sites,
  etc.) for their writing                                 57%                 80%
Understand the audience when writing                      86%                 100%
Write multiple drafts of their assignments                73%                 60%
Revise (make substantive changes to) their writing        100%                80%
Edit/Proofread their writing                              86%                 80%
Recognize strengths and weaknesses in their writing       86%                 80%
Address weaknesses in their own writing                   86%                 100%

Table 25: Writing Mentors' Perceptions of Their Own Impact

In keeping with previous years of QEP assessment, Mentors' scores across all the outcomes are consistently high. Also in line with previous years' responses, Mentor surveys point to the practice of composing multiple drafts as an area of student writing needing further attention. It is also quite possible that, with revision now occurring in electronic files, discrete "drafts" are harder to identify and thus harder to count as "multiple": rather than producing multiple drafts, writers keep a single file, a draft that they continually revise. That Mentors' agreement with the statement "You were able to help students revise (make substantive changes to) their writing" runs 20 to 27 percentage points higher than their agreement with "write multiple drafts of their assignments" suggests that this model, in which one draft is constantly under revision, may account for the lower scores on the latter statement.

5.1.4 Comparison of Writing Mentor Program Perceptions of Impact

The tables below allow for comparison of faculty, mentor, and student perceptions of the impact of the Writing Mentors Program.


Fall 2016 Comparison

How much do you agree or disagree with the following statements (1-5 scale, 1=strongly disagree and 5=strongly agree)? The Writing Mentor helped me/students to…

                                                  Mentor              Faculty             Student
                                                  % "Agree" or        % "Agree" or        % "Agree" or
                                                  "Strongly Agree"    "Strongly Agree"    "Strongly Agree"
Understand writing assignments                    100%                86%                 68%
Develop ideas for writing                         100%                43%                 80%
Establish and maintain a thesis/focus             100%                86%                 71%
Find good outside sources (books, articles,
  web sites, etc.) for writing                    57%                 71%                 53%
Understand audience when writing                  86%                 57%                 65%
Write multiple drafts                             73%                 86%                 70%
Revise (make substantive changes to) writing      100%                86%                 80%
Edit/proofread writing                            86%                 86%                 74%
Recognize strengths and weaknesses in writing     86%                 86%                 67%

Table 26: Comparison of Writing Mentor, Faculty, and Student Perceptions of Impact, Fall 2016

Spring 2017 Comparison

How much do you agree or disagree with the following statements (1-5 scale, 1=strongly disagree and 5=strongly agree)? The Writing Mentor helped me/students to…

                                                  Mentor              Faculty             Student
                                                  % "Agree" or        % "Agree" or        % "Agree" or
                                                  "Strongly Agree"    "Strongly Agree"    "Strongly Agree"
Understand writing assignments                    100%                100%                90%
Develop ideas for writing                         100%                100%                90%
Establish and maintain a thesis/focus             100%                100%                87%
Find good outside sources (books, articles,
  web sites, etc.) for writing                    80%                 75%                 64%
Understand audience when writing                  100%                100%                85%
Write multiple drafts                             60%                 100%                82%
Revise (make substantive changes to) writing      80%                 100%                92%
Edit/proofread writing                            80%                 100%                92%
Recognize strengths and weaknesses in writing     80%                 75%                 87%

Table 27: Comparison of Writing Mentor, Faculty, and Student Perceptions of Impact, Spring 2017

It is worth noting that, in both semesters reported above, the area of "finding good outside sources for writing" received "agree" or "strongly agree" responses in less than 80% of surveys, with the exception of the Mentors' responses in spring 2017 (80%). This same area was identified as an area for attention in last year's QEP Assessment Report, and, in light of these results, it will continue to receive attention in programs during the 2017-2018 academic year. Mentor training will again incorporate work with librarians who specialize in research practices and database resources in different disciplines. Additionally, Mentors and faculty teaching WI courses will be encouraged to promote the many resources available on the Writing@ECU website and the Joyner Library website related to finding good sources.

5.2 Results and Discussion of University Writing Center Assessments

Usage data for the University Writing Centers (at Joyner and Laupus Libraries and through the Online Writing Lab) continues to demonstrate the impact the QEP expansion has had, with increasing numbers of students assisted each year. The table below provides usage data from this academic year as well as the past four academic years:

UWC Location         2012-2013    2013-2014    2014-2015    2015-2016    2016-2017
                     (pre-QEP)    (QEP Yr. 1)  (QEP Yr. 2)  (QEP Yr. 3)  (QEP Yr. 4)
Face-to-face sites   1,918        3,755        3,809        5,115        5,149
Online Writing Lab   561          1,022        1,275        1,946        3,660
Total                2,479        4,777        5,084        7,061        8,089

Table 28: University Writing Center Usage, 2012-2017
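As a rough gauge of that growth, a minimal Python sketch (using the Total row exactly as reported above; the script is illustrative and not part of the QEP assessment process) computes year-over-year percentage increases in sessions:

    # Illustrative only: total UWC sessions per year, transcribed from the
    # Total row of the usage table above.
    totals = [
        ("2012-2013", 2479),
        ("2013-2014", 4777),
        ("2014-2015", 5084),
        ("2015-2016", 7061),
        ("2016-2017", 8089),
    ]

    # Year-over-year percent growth.
    for (prev_year, prev), (year, curr) in zip(totals, totals[1:]):
        growth = 100 * (curr - prev) / prev
        print(f"{prev_year} to {year}: {growth:+.1f}%")

    # Overall change from the pre-QEP baseline to QEP Year 4.
    overall = 100 * (totals[-1][1] / totals[0][1] - 1)
    print(f"2012-2013 to 2016-2017: {overall:+.1f}%")

Run as written, the sketch shows growth in every year of the QEP, with the largest single-year jump in Year 1.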

Of course, usage should not be the only measure of the impact of the UWC expansion. As noted in the assessment plan for the QEP, UWC exit surveys were administered to all ECU community members who received assistance beginning in the 2012-2013 academic year. Thus, pre-QEP implementation response data can be compared with post-QEP implementation response data to determine if satisfaction with the UWC's services has changed at all with the expansion. Data for spring 2013 (pre-QEP implementation) is not available due to technical problems that arose during the process of moving into a temporary space while the new UWC space was under construction:

Level of            Fall    Fall    Spring  Fall    Spring  Fall    Spring  Fall    Spring
Satisfaction        2012    2013    2014    2014    2015    2015    2016    2016    2017
Very Satisfied      78%     77%     81%     73%     81%     79%     81%     81%     84%
Satisfied           22%     22%     19%     26%     19%     20%     18%     18%     15%
Dissatisfied        0%      1%      0%      1%      1%      1%      0.5%    0.3%    0.4%
Very Dissatisfied   0%      0%      0%      0%      0%      0%      0%      0.1%    0.1%

Table 29: University Writing Center Exit Survey "Satisfaction" Question Responses, 2012-2017

Satisfaction levels among users of the UWC remained consistently high from pre- to post-QEP implementation, suggesting that the expansion of the center has not had a negative impact on the quality of the help consultants provide for students.

6. Faculty Support Initiatives Assessment

6.1 Results and Discussion of Summer WAC Academy Survey

Four of five participants—representing programs in English, Criminal Justice, Education, and Foreign Languages and Literatures—responded to a Qualtrics survey about the summer 2017 Advanced WAC Academy. Numbers in the academy are kept purposefully low to enable extensive interaction and collaboration.

Respondents were asked to indicate their level of agreement with a number of statements about their experience in the academy. Response numbers are below.


Statement                                                Strongly      Disagree   Agree   Strongly
                                                         disagree (1)  (2)        (3)     agree (4)
I have a better understanding of how to help my
  students become stronger writers now than I did
  before the Academy.                                    0             0          0       4
The readings and activities will help inform my
  planning and instruction to encourage the transfer
  of skills and knowledge.                               0             0          0       4
The group discussions throughout the Academy were
  useful for informing my instruction.                   0             0          0       4
I will use the knowledge and materials I gained in
  the Academy in my classroom.                           0             0          0       4
I plan to share the knowledge and materials gained
  in the Academy with my colleagues.                     0             0          1       3
I would recommend the WAC Academy: Transfer of
  Writing Skills and Knowledge to someone else in
  my department.                                         0             0          0       4

Table 30: Responses to 2017 Summer WAC Academy Survey

Survey results suggest that the Advanced WAC Academy continues to positively impact participants.

6.2 Results and Discussion of Writing Liaisons Survey

A survey was distributed to Writing Liaisons at the conclusion of the spring 2017 semester. Thirty-one Liaisons completed the survey (a 79% response rate). Of those who responded, 86% rated their involvement with the Writing Liaisons program as "Good" or "Very Good."

Given that a primary goal of the Liaisons program is to increase communication across campus about writing, writing instruction, and writing support, Liaisons were asked to indicate how often they discuss information that they have received through the Liaisons program with others in their departments, programs, and/or colleges. Data from the 2014-15 and 2015-16 Liaisons' surveys are included for comparison.

How frequently during your time as a Writing Liaison have you talked with colleagues in your program, department, or college about issues related to the QEP, the University Writing Program, the University Writing Center, or the Writing Foundations program?

Response                 % of Responses   % of Responses   % of Responses
                         2014-15          2015-16          2016-17
Never                    0%               4%               0%
Less than Once a Month   40%              27%              17%
Once a Month             32%              46%              48%
2-3 Times a Month        28%              23%              21%
Once a Week              0%               0%               10%
2-3 Times a Week         0%               0%               4%
Daily                    0%               0%               0%

Table 31: Writing Liaisons' Time Spent Communicating with Colleagues about Writing Programs

Over the course of the QEP, the percent of respondents indicating that they report to their colleagues less often than once per month has dropped significantly, a reflection of many efforts on the part of the QEP team and the Liaisons themselves, including the now regular practice of the QEP Director sharing "talking points" with Liaisons after monthly meetings and the Liaisons forwarding those points to program colleagues.

Accuracy of the information that Liaisons share is as important as the frequency of discussion. Thus, Liaisons were asked to rate their ability to explain various aspects of the QEP and/or the University Writing Program to their colleagues. Percentages of respondents who selected each rating are shown below, along with data from the two previous annual Liaison surveys for comparison:

Area                                            % Good or     % Good or     % Good or
                                                Very Good     Very Good     Very Good
                                                2014-15       2015-16       2016-17
The services of the University Writing Center   79%           89%           86%
The Writing Mentors Program                     84%           80%           82%
The structure and goals of the WAC
  (Writing-intensive) program                   71%           65%           86%
The University Writing Portfolio                63%           65%           75%
Writing and Learning Communities                37%           42%           71%
The QEP Student Learning Outcomes               71%           77%           82%
The QEP/WI assessment process                   55%           61%           82%
The goals and rationale for the revised
  Writing Foundations (English 1100 and 2201)
  sequence                                      75%           80%           89%

Table 32: Writing Liaisons' Perception of Knowledge about Writing Programs at ECU

As noted in last year's QEP assessment report, efforts were undertaken in 2016-17 to better publicize the purposes and functions of the "Writing and Learning Communities" in response to the fact that only 42% of Liaisons reported that their knowledge of this program was "good" or "very good." Those efforts appear to have paid off, with a jump to 71% of Liaisons reporting "good" or "very good" levels of familiarity with the WLCs in 2016-17.

Also worth noting is the increase in percentage of Liaisons reporting "good" or "very good" knowledge of the QEP/WI assessment process between AY 2014-15 and AY 2016-17. That 82% of Liaisons now feel that they understand and are prepared to explain this assessment process suggests that the process is positioned well to continue post-QEP. Additionally, it merits mention that over the 4 years of the QEP, 13 different Writing Liaisons have served on the WAC Committee of the Faculty Senate, including the current Chair of that committee. The Liaisons program enables connections and fosters commitment to writing across the university.

6.3 Results and Discussion of Metacognition Workshop Series Surveys

A total of 7 faculty members from various disciplines completed the three-session workshop during AY 2016-17. Six workshop participants completed a survey about the workshop series. Percentages of respondents answering "Excellent" and "Good" are provided below, along with percentages from AY 2015-16 and AY 2014-15 for comparison.


Question                                         2014-15     2014-15   2015-16     2015-16   2016-17     2016-17
                                                 Excellent   Good      Excellent   Good      Excellent   Good
How organized, knowledgeable, and prepared
  was the presenter?                             91%         9%        94%         6%        100%        0%
How effective was the presentation of content
  (interesting and engaging)?                    80%         20%       100%        0%        67%         33%
To what extent do you think the workshop
  content will be applicable and useful?         82%         8%        94%         6%        82%         8%
The quality of the materials and resources
  offered were…                                  80%         20%       94%         6%        66%         33%
To what extent did the presenters provide
  opportunities for discussion, interaction,
  and questions?                                 100%        0%        100%        0%        100%        0%
How appropriate was the amount of time
  allotted for this workshop?                    64%         36%       75%         25%       66%         33%
Overall, I found this series to be…              82%         8%        94%         6%        83%         17%

Table 33: Writing and Metacognition Workshop Series Participant Survey Results

Feedback for the series continues to be positive. Changes made to the structure of the workshops to allow for more time appear to have improved participants' experiences. In AY 2017-18, the workshop series leaders will continue to focus on adjusting activities to ensure they all fit into the one-hour slot. Many faculty can only stay for one hour due to teaching schedules, so extending the workshop time is not a feasible option. Given that the limited time available for workshop sessions makes it difficult to discuss multiple strategies for promoting metacognition, we have made multiple resources available for faculty and students on the Writing@ECU website and we continue to publicize and add to these resources.

6.4 Results and Discussion of Writing and Learning Communities Survey

Early in the fall 2016 semester, a call for WLC participants was circulated via Announce and via an email distribution list of writing-intensive course instructors. Nine people responded, and two WLC groups were formed to further investigate strategies for helping students become more effective editors and proofreaders of their own writing. Two participants were unable to complete the WLC process over the full academic year, resulting in two WLCs with a total of 7 participants: one WLC with 3 members, and one WLC with 4 members. The work of these 7 faculty members, who represented multiple areas (Communication, English, Health Education and Promotion, and the Career Center), culminated in spring 2017 with two workshops in which WLC participants shared specific teaching and feedback strategies that they had developed to help students improve in the area of SLO 4. The strategies included the following:


• Using read-aloud audio recordings to help students identify and address errors in drafts

• Employing minimal marking in responding to student writing to promote students' abilities to recognize, identify, and revise sentence-level problems

• Assigning cross-genre projects that ask students to write similar information/ideas in significantly different styles to help them better recognize and adapt to different stylistic expectations

• Asking students to take on the perspective of an editor by analyzing and applying a style guide from a well-known publication (in this case, Wikipedia)

• Having students take on the perspective of an employer who must rapidly review résumé and cover letter drafts composed by other students

• Providing students with direct instruction in how dialects differ, particularly between local/home contexts and school/work contexts

6.4.1 WLCs Feedback Survey

The 6 faculty members who completed the WLC Feedback Survey reported high satisfaction, with 100% rating their involvement as "Very Good." All respondents also reported that they were "Pleased" or "Very Pleased" with all 10 facets of the WLCs presented in the survey. Additionally, all respondents reported that it is "Very Likely" that the experience will influence how they approach teaching writing.

How pleased were you with the following aspects of your WLC? If an aspect in the list does not apply, simply leave it blank. (Responses on a 1-5 scale, 1=Very Displeased and 5=Very Pleased.)

Aspect                                          2015-16 Mean    2016-17 Mean
Format/Structure                                4.4             5.0
Amount of work                                  4.5             5.0
Meetings/Interaction (face-to-face)             4.6             5.0
Meetings/Interaction (online)                   4.5             4.3
Reading                                         4.6             4.8
Quality of materials and projects developed     4.5             5.0
Timeline for work                               4.5             5.0
Stipend for participation                       4.1             5.0
Process of developing materials and projects    4.5             5.0
Opportunities for discussion, interaction,
  and questions                                 4.6             5.0

Table 34: WLC Feedback Survey Results

It is notable that all 6 respondents, when asked whether future communities should have more, less, or the same amount of structure, indicated that WLCs should have "the same amount." The amount of structure for the WLCs has been marked as an area for improvement in the past, so this year's survey responses are very encouraging.

Feedback in response to open-ended prompts included the following:

• "This workshop was extremely informative, for it provided information that can easily be applied in the classrooms across the curriculum. Every meeting was interesting and enlightening. Each session motivated me to try something new."

• "This was a wonderful experience for me. I learned so much from the readings and from the other people in the group. I will use this experience in all of my classes."


• "The amount of meetings/readings/overall work was perfect for me, and discussion was great."

7. Assessment Activities Planned for 2017-2018

In addition to continuing many of the assessments discussed in this report, AY 2017-18 will include the following assessment-related projects for the QEP:

• Post-implementation assessments of WI writing samples for remaining programs.

• Review and analysis of participant responses to 2017 Eastern Carolina Writing Symposium survey.

• Implementation and analysis of Post-QEP Student Survey of Writing Experiences.

• Implementation and analysis of Post-QEP Faculty Survey about Student Writing.

• External Assessment of 10% of WI samples from 2017 Summer WI Assessment.