The Academic Program Review
Bridging Standards 7 and 14
Middle States Annual Conference
December 10, 2010
Presenters
Mr. H. Leon Hill
Director of Institutional Research
Dr. Joan E. Brookshire
Associate Vice President of Academic Affairs
Overview
• Framework to Address the APRs
• Structure/Challenges/Approach
• Examples of Metrics
• Current Action Plan
• Integration of “End User” Technology
• Next Steps
• Benefits of Our Approach
• Questions
Assessment Cycle-2005
What we had to build on
• Strong focus on programs.
• State-mandated 5-year academic program review in need of revision.
• Institutional Effectiveness Model (IEM) with performance indicators benchmarked through state and national databases.
[Diagram: Mission → Strategic Initiative: Access & Success → Institutional Effectiveness → IEM]
• Needed a way to assess how the College was performing on key metrics in relation to prior years/semesters and compared to other institutions.
• Historical/Trend data
• Benchmark data – Pennsylvania & National Peers
• Institutional Effectiveness Model
Where we started
• Restructured the Academic Program Review process
• Incorporated the use of technology
Goal of the restructuring
• Measure student performance as evidenced by the results of assessment of student learning outcomes.
• Measure program performance as evidenced by comparison of program performance to overall college performance on specific key indicators (current and aspirational).
Challenges
• Usual issues with assessment in general.
• Faculty had little knowledge of the College’s performance indicators.
• Organizational separation of assessment of institutional and student learning outcomes.
Approach
We began by building backwards from the IEM, mapping specific core indicators to program data and making additions where needed.
Examples of Metrics Used for APR
TARGETS           Caution    Acceptable    Aspirational
Graduation Rate   <19%       19%-23%       >23%
Transfer Rate     <29%       29%-32%       >32%
Definitions of Success & Retention
Success = grades of (A, B, C & P) / (A, B, C, D, F, P & W)
Retention = grades of (W) / (A, B, C, D, F, P & W)
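A worked example of these two ratios, computed from hypothetical grade counts:

```python
# Sketch of the course-level success and retention definitions above,
# computed from a dictionary of grade counts (hypothetical data).
grades = {"A": 40, "B": 35, "C": 30, "D": 12, "F": 18, "P": 5, "W": 20}

attempted = sum(grades.values())  # A, B, C, D, F, P & W
success_rate = (grades["A"] + grades["B"] + grades["C"] + grades["P"]) / attempted
retention_rate = grades["W"] / attempted  # W share, as defined on the slide

print(f"Success: {success_rate:.1%}")
print(f"Retention (W share): {retention_rate:.1%}")
```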
Added a curricular analysis
• How well program goals support the college’s mission.
• How well individual course outcomes reinforce program outcomes.
• How well instruction aligns with the learning outcomes.
• Specific assessment results.
• Changes made based on the assessment findings.
• Evidence of closing the loop.
• Changes made to the assessment plan.
Action Plan
• Outcomes expected as a result of appropriate action steps.
• Timelines and persons responsible for each action step.
• Resources needed with specific budget requests.
• Evaluation plan with expected benefits.
Bottom Line
• Is there sufficient evidence that the program learning outcomes are being met?
• Is there sufficient evidence that the program is aligned with the college on specific key indicators?
The Framework
STANDARDS 1-7 Focus on Institutional Performance
Standard 7: Assessment of Institutional Effectiveness. How well are we collectively doing what we say we are doing, with a specific focus on supporting student learning? Assessment must be included in Standards 1-6.
STANDARDS 8-14 Focus on Student Performance
Standard 14: Assessment of Student Learning. Do we have clearly articulated learning goals, offer appropriate learning activities, assess student achievement of those learning outcomes, and use the results of assessment to improve teaching and learning and inform budgeting and planning? Also in Standards 8-13.
Strategic Analysis
[Framework diagram: institutional data linked to the IEM through common data sets; curriculum analysis spanning course, program, and core assessment; APR action plans and annual reports feeding planning and budgeting (Standard 2) and annual academic planning; assessment results flowing through the Curriculum Committee and the President's Office to the Board of Trustees.]
Addition of Technology
• Worked in concert with Information Technology to integrate iStrategy with the ERP (Datatel).
• This implementation let end users obtain the data needed for program assessment without going through an intermediary (IR and/or IT).
Next Steps in the Evolution of College and Program Outcomes
Example of APR Report Card
Persistence Fall to Fall
                          2005      2006      2007
Full-time Persistence
  Program                 75%       50%       70%
  College                 64.39%    63.16%    50%
Part-time Persistence
  Program                 25%       43.48%    33.33%
  College                 33.52%    37.09%    34.41%
Persistence – These data are based on a cohort of first-time students from a specific semester and follow their enrollment patterns one year out.
Fall to Spring
                          2006      2007      2008
Full-time Persistence
  Program                 70%       80%       76.92%
  College                 78.19%    80.98%    80.93%
Part-time Persistence
  Program                 69.57%    45.83%    46.15%
  College                 45.12%    41.60%    39.00%
Persistence – These data are based on a cohort of first-time students from a specific semester and follow their enrollment patterns from the Fall semester to the following Spring semester.
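For illustration, a minimal sketch of how a fall-to-fall persistence rate of this kind could be computed from cohort records; the record layout and field names are assumptions, not the College's actual data model.

```python
# Hypothetical cohort records: first-time students from a given Fall, flagged
# by whether they re-enrolled the following Fall (one year out).
cohort = [
    {"id": 1, "status": "full-time", "enrolled_next_fall": True},
    {"id": 2, "status": "full-time", "enrolled_next_fall": False},
    {"id": 3, "status": "part-time", "enrolled_next_fall": True},
]

def persistence_rate(records, status):
    """Share of the cohort with the given status who re-enrolled."""
    group = [r for r in records if r["status"] == status]
    if not group:
        return 0.0
    return sum(r["enrolled_next_fall"] for r in group) / len(group)

print(f"Full-time persistence: {persistence_rate(cohort, 'full-time'):.1%}")
print(f"Part-time persistence: {persistence_rate(cohort, 'part-time'):.1%}")
```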
Graduation and Transfer
Degrees Conferred
            2006    2007    2008    Three-Year % Change
Program     9       9       8       -11.11%
College     1058    1046    1127    6.52%
Degrees Conferred – These are the actual number of degrees conferred, not the degrees earned
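As a quick check of the arithmetic, the three-year percent change above is simply the change from 2006 to 2008 divided by the 2006 count:

```python
# Worked example: college-level degrees conferred, 2006 vs. 2008 (from the table above).
degrees_2006, degrees_2008 = 1058, 1127
pct_change = (degrees_2008 - degrees_2006) / degrees_2006
print(f"{pct_change:.2%}")  # -> 6.52%
```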
Pass Rates on Licensure Exams
Graduation and Transfer Rate: Cohort Entering 2005
                        Total      Degrees   Graduation Rate    Transfers   Transfer Rate
                        Students             (Within 3 Years)               (Within 3 Years)
Full-time   Program     14         1         7.14%              5           35.71%
            College     1524       192       12.60%*            402         26.38%
Part-time   Program     9          0         0%                 4           44.44%
            College     990        26        2.62%              232         23.43%

First-time students to the College, no transfer credits. *Acceptable range: 19%-23%
Fiscal
Evidence of High Priority and Employment Predictions
• Examples of Course Success
Success in ACC 111
                 2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% Success        61.4%    57.1%    55.4%    55.3%    51.4%    44.2%    48.3%
# Success        329      276      253      281      261      244      249
% Non-Success    38.6%    42.9%    44.6%    44.7%    48.6%    55.8%    51.7%
# Non-Success    207      207      204      227      247      308      267
Success in ACC 111
                    2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% Female Success    63.3%    57.5%    58.8%    57.7%    57.3%    51.8%    58.7%
# Female Success    145      111      104      123      114      115      105
% Male Success      59.8%    56.8%    53.2%    53.6%    47.2%    39.1%    42.1%
# Male Success      180      163      149      158      145      127      133
Success in Math 010
                 2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% Success        53.6%    46.3%    47.3%    45.7%    44.8%    43.3%    47.4%
# Success        310      266      276      293      297      288      344
% Non-Success    46.4%    53.7%    52.7%    54.3%    55.2%    56.7%    52.6%
# Non-Success    268      309      307      348      366      377      381
Success in Math 010
                             2003/FA  2004/FA  2005/FA  2006/FA  2007/FA  2008/FA  2009/FA
% African American Success   42.6%    37.7%    38.5%    25.8%    26.9%    29.9%    34.7%
# African American Success   43       46       40       33       45       56       51
% Caucasian Success          58.2%    51.8%    50.3%    52.7%    53.5%    48.4%    52.0%
# Caucasian Success          202      184      180      217      206      180      141
Benefits
• Built a bridge between Standards 7 and 14.
• Better data.
• By putting data in the hands of faculty, they become actively engaged in using data for decisions and planning.
• IR time better used.
• Annual planning cycle developed.
• Built a culture of assessment in several of the academic divisions.
• Curricular changes that align with the graduation initiative.
• Curricular and program improvement.
• Created a college-wide model for improvement of student learning.
Evolution of the Dashboard
• Creation of a Student Success Dashboard
Metrics:
• Course-level success and retention (developmental and college-level)
• Persistence (fall to spring and fall to fall)
• Progression of various cohorts of students
• College-level success in Math or English after developmental Math or English
• Graduation
• Transfer
Graphic Representation for the SSD
Final Thoughts
It’s not perfect, but it works for us.
• Do the research on which tools are appropriate for your college
• Assessment of the core curriculum
• Launching of assessment software
• It all starts with asking the right question
• PRR 2010
Questions
Presenters
Mr. H. Leon Hill
Director of Institutional Research
Dr. Joan E. Brookshire
Associate Vice President of Academic Affairs