
Student Learning Outcomes—A Focus on Results


The Teaching-Learning Cycle:

Using Student Learning Outcome Results

to Improve Teaching and Learning

Workshop Activities & Resource Materials


Bill Scroggins
November 2004


Table of Contents

Student Learning Outcomes at the Lesson Level
Student Learning Outcomes at the Course Level: From Course Objectives to SLOs
Primary Trait Analysis: Statements of Grading Criteria
Selecting the Assessment Method: Authentic Assessment and Deep Learning
Norming or Inter-Rater Reliability: Assuring Consistency of Grading Among Faculty
The Assessment Report: Sharing the Results of Student Learning Outcomes
Program Level Student Learning Outcomes
Direct and Indirect Measures of Student Learning Outcomes
Identifying Program Competencies—External and Internal Sources
Strategies for Direct Assessment of Program SLOs: Mosaic and Capstone Approaches
General Education Student Learning Outcomes
Conclusion

Appendices

Appendix 1 – Good Practices in Assessing Student Learning Outcomes
Appendix 2 – Activity 3: Writing Student Learning Outcomes
Appendix 3 – Developing and Applying Rubrics
Appendix 4 – Examples of Scoring Rubrics
Appendix 5 – Activities 4 & 5: Building and Using a Grading Rubric
Appendix 6 – The Case for Authentic Assessment by Grant Wiggins
Appendix 7 – State and National Standards, Academic & Vocational Competencies
Appendix 8 – Assessment Report Examples
Appendix 9 – Assessment Plan Examples Internet Sites
Appendix 10 – Activity 5 – Program SLOs from Competency Statements
Appendix 11 – Examples of Program Assessment Reports
Appendix 12 – General Education Student Learning Outcomes
Appendix 13 – Resources and References for Student Learning Outcomes Assessment

Endnotes

URL for this document: http://cai.cc.ca.us/workshops/SLOFocusOnResults.doc

For further information contact: Bill Scroggins, Interim President, Modesto Junior College, [email protected]


The Teaching-Learning Cycle: Using Student Learning Outcome Results to Improve Teaching & Learning

Since the Accrediting Commission identified “measuring student learning outcomes” as the focus of the latest revision of the WASC standards, many of us have been struggling with what we are expected to do differently.i Whatever we do to implement Student Learning Outcomes, this initiative must be seen to add value to the teaching and learning process—value that clearly outweighs the task of constructing SLOs. Those of us who have taught for years consider that we already measure student learning. However, I have come to believe that SLOs really do have a new and useful emphasis that can be best captured by one word: results—collecting them, sharing them, and using them to improve both learning and the operation of our colleges. This series of reflections is intended to address getting useful results from the SLO process—maximizing utility and minimizing futility. (That little “f” really makes a difference, doesn’t it?)

Student Learning Outcomes at the Lesson Level

As we teach each lesson and grade the related student assignments, we typically have a clear concept of the results expected, and we have defined methods for assessing student work and assigning grades. However, there are several things that we typically don’t do that can potentially improve student learning. While many of us do give students written learning objectives for each lesson, we usually do not write down criteria for grading nor share them with students—other than how total points relate to the final grade in the course.

In listening to practitioners of SLOs such as Lisa Brewsterii, a Speech teacher at San Diego Miramar College, and Janet Fulksiii, a Microbiology teacher at Bakersfield College, it is clear that SLOs can become a powerful pedagogical tool by:

sharing grading criteria with students, getting students to use these criteria as a way to better understand the material, and having students evaluate their own and each other’s work.

Activity 1
In small groups by discipline or cluster of related disciplines, discuss how you develop grading criteria.

Do you write down your grading criteria for each assignment?
How consistent are you in applying your grading criteria?
Do you use the results of student assessment to improve your grading criteria?
Do you communicate your grading criteria to students? Before or after the assignment?
Do you encourage students to apply the grading criteria to their own work?
Do you involve students in developing or modifying your grading criteria?
Do you share your grading criteria with other faculty?

Results of SLOs at the Lesson Level
Most of us have criteria for grading student learning for the individual objectives of each lesson we teach—but we may not write these down or share them with students. Here’s an example:

Lesson Learning Objective: Describe and draw the four vibrations of carbon dioxide and show how IR light is absorbed by CO2.
Sample Graded Question: What two types of motion are caused by the absorbance of IR light by CO2? Draw one of these motions.
Grading Criteria:
Full credit: Student names “bending” and “stretching” and draws ball-and-stick models with arrows up-and-down for bending and side-to-side for stretching.
Deductions: 25% for one name missing; 50% for both; 25% for wrong or missing arrows; 50% for no drawing.
Results of Grading Student Work: 82% earned full credit; 5% confused arrows; 13% had no drawing.
Action to Improve Learning: The greatest deficiency seems to be drawing, so do an in-class drawing exercise.


Aggregating the feedback from grading student assignments can provide valuable insight into areas in need of improvement. With all the demands on our time, we may not give adequate attention to mining this valuable source of information for improvement of the teaching and learning process. One of the challenges that the new accreditation standards present is creating an assessment plan that outlines objectives, grading criteria, results of assessing student work, and how we use those results to improve student learning.
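To make that aggregation step concrete, here is a minimal sketch in Python. It is not part of the original workshop materials; the grading records and deficiency labels are invented for illustration, loosely following the CO2 example above.

```python
# A hypothetical sketch of aggregating per-criterion grading results.
# The deficiency labels and records below are invented for illustration.
from collections import Counter

# One entry per student response: the deficiencies noted while grading
# (an empty list means full credit was earned).
graded_responses = [
    [], [], ["no_drawing"], [], ["confused_arrows"], [], [], ["no_drawing"],
    [], [], [], ["no_drawing"], [], [], [], [],
]

totals = Counter()
for deficiencies in graded_responses:
    if not deficiencies:
        totals["full_credit"] += 1
    else:
        totals.update(deficiencies)

n = len(graded_responses)
for category, count in totals.most_common():
    print(f"{category}: {count}/{n} = {count / n:.0%}")
```

A tally like this, kept assignment by assignment, is all that is needed to spot the most common deficiency and choose a follow-up exercise.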

Of course, part of improving student learning is improving the way we teach. This inevitable outcome can potentially be threatening to faculty members. However, when these issues have been raised in workshops with faculty, the result has generally been a serious engagement in discussions of teaching methods to improve authentic, deep learning. iv It is extremely important to build environments for discussing the improvement of student learning which are positive and reinforcing. Several colleges have made explicit commitments to this principle.v (The endnote references include approaches by Palomar College in California, College of DuPage in Illinois, and the American Association of Higher Education.)

Activity 2
Read the following resource documents (see Appendix 1) and join in the group discussion on “Good Practices for Assessment of Student Learning Outcomes.”

“An Assessment Manifesto” by College of DuPage (IL)
“9 Principles of Good Practice for Assessing Student Learning” by AAHE
“Palomar College Statement of Principles on Assessment” from Palomar College (CA)
“Closing the Loop—Seven Common (Mis)Perceptions About Outcomes Assessment” by Tom Angelo
“Five Myths of ‘Assessment’” by David Clement, faculty member, Monterey Peninsula College

Student Learning Outcomes at the Course Level: From Course Objectives to SLOs

Beyond the lesson level, we must address results of student learning at the course level. Moreover, we should do so for all sections of each course, meaning collaboration among the full- and part-time faculty teaching the course. In stating the desired student learning outcomes, we have the advantage of agreed-upon student objectives in the course outline.

A great deal of energy has been expended in discussing the difference between a course objective and a student learning outcome. The difference may be clearer when viewed in the context of producing assessment results that 1) provide useful feedback to improve the teaching and learning process and 2) provide useful information to improve college practices. SLOs more clearly connect with how the instructor will evaluate student work to determine if the objective has been met. When we write an assignment, we provide a context in which the student will respond and we evaluate the response based on criteria we use to judge if the student has met the objective—usually we have at least a mental construct of minimum acceptable performance standards. These are the two additional pieces that transform an objective into an SLO. Here’s how it might work.

If course objectives have been written well, they will be complete, measurable, and rigorous. In practice, as faculty look more closely at the criteria and methods to assess these objectives, changes often result. To “operationalize” an objective for assessment purposes, that is, to transform it into a statement of desired student learning outcomes, typically we must address:


1) the stated objectives in terms of acquired knowledge, skill or values (hopefully, the existing course objectives),

2) the context or conditions under which the student will be expected to apply the knowledge, skill or values, and

3) the primary traits which will be used in assessing student performance.

Below are some examples of “robust course objectives” or “statements of desired student learning outcomes.” (Note that this difference is largely semantic. Some colleges have chosen to put SLO statements in course outlines as an enhancement of the objectives, while others have built statements of desired SLOs into a departmental assessment plan, typically related to program review.) Whatever vehicle the college uses to operationalize course objectives to SLOs, it must be done collaboratively among faculty who teach the course.

Examples of Course Objectives Transformed Into Student Learning Outcomes

Course Objective: Write well-organized, accurate and significant content. (English)
Statement of Desired SLO:
Context: Given an in-class writing task based on an assigned reading,
Objective: demonstrate appropriate and competent writing which
Traits: states a thesis, supports assertions, maintains unity of thought and purpose, is organized, and is technically correct in paragraph composition, sentence structure, grammar, spelling, and word use.

Course Objective: Analyze behavior following the major accepted theories. (Psychology)
Statement of Desired SLO:
Context: Given a particular behavior and its context (e.g., playing incessantly with one’s hair when under pressure in the presence of the opposite sex),
Objective: describe how the perspectives of behaviorism, humanistic, psychoanalytic, and biological psychology would interpret that behavior and what methods might each use to alter that behavior.
Traits: Include theoretical basis, description of causality, and treatment regimen.

Course Objective: Understand and apply the scientific method. (Biology)
Statement of Desired SLO:
Context: Given a hypothesis,
Objective: design experiments and interpret data according to the scientific method in order to evaluate the hypothesis.
Traits: Include the ability to approach the scientific method in a variety of ways, formulate questions, design experiments that answer the questions; and manipulate and evaluate the experimental data to reach conclusions.

Course Objective: Compare and contrast the text and film versions of a literary work. (Film)
Statement of Desired SLO:
Context: After viewing an assigned film based on a literary text,
Objective: write a review of the film.
Traits: Include an appraisal of the director’s selection and effective translation of content from the literary text and the dominant tone the director seems to be trying to achieve, supporting each statement with detail from the text and film and your personal reaction to the cited scenes.

Activity 3
Perform the “Writing Student Learning Outcomes” exercise in Appendix 2. Review the first example. Then for the second course objective, complete the Performance Context, Measurable Objective, and Primary Traits. Finally, select an objective from a course in your discipline and construct the three-part SLO statement.


Building a Rubric
Start with expectations for satisfactory work for each trait, such as Organization in the table below:
Ideas generally related to one another and to the focus, but may have some unrelated material
Adequate introduction and conclusion
Some attempt at transitions
Then stretch up to excellent and down to unsatisfactory.

Primary Trait Analysis: Statements of Grading Criteria

Primary traits are the characteristics that are evaluated in assessing student work. Identifying primary traits for a given assignment involves listing those specific components that, taken together, make up a complete piece of work. They are the collection of things that we as teachers look for when we grade student work.

Definition of Primary Trait Assessment
Primary trait assessment is a method of explicitly stating the criteria and standards for evaluation of student performance of an assignment or test. The professor identifies the traits that will be evaluated, and ranks the student's performance of each trait on a scale of "most effective" to "least effective" realization of the assignment goals. On this scale, the level of the student's performance is explicitly ranked so that the student knows how she is being evaluated. The instructor has created the scale for direct application to the assignment the student is performing so that if the entire class does poorly on the assignment, it is clear to the instructor what difficulties the class may share with one another. This recursive feedback of primary trait assessment can be used to inform classroom and departmental improvement.vi

While “primary traits” are the categories into which we can sort competencies when we evaluate student work, we look for specific levels of performance in each of these areas. For example, an essay might be rated on development, organization, style, and mechanics. These primary traits are then rated on some sort of scale—as simple as A/B/C/D/F or as descriptive as excellent/superior/satisfactory/poor/unsatisfactory. Occasionally, points are given based on this scale. The challenge presented by the Student Learning Outcomes process is to write down those observable student performance characteristics in an explicit way for each of the primary traits we have identified. This system, known as a “grading rubric,” can be used to grade student work collected through all manner of assessment methods.vii

Template for a Grading Rubric: Primary Traits and Observable Characteristics

Trait Excellent Superior Satisfactory Poor Unsatisfactory

Development

Organization

Style

Mechanics

Rubrics can be applied in total by specifically rating each primary trait (an “analytic” grading rubric) or holistically (using the rubric as a guide to determine the overall rating of excellent, satisfactory, or unsatisfactory—or whatever performance levels have been agreed upon). An example is given below.


Primary Trait Grading of Math Problem Solvingviii

Understanding
3 points: complete understanding of the problem in the problem statement section as well as in the development of the plan and interpretation of the solution
2 points: good understanding of the problem in the problem statement section. Some minor point(s) of the problem may be overlooked in the problem statement, the development of the plan, or the interpretation of the solution
1 point: minimal understanding of the problem; the problem statement may be unclear to the reader. The plan and/or interpretation of the solution overlooks significant parts of the problem
0 points: no understanding of the problem; the problem statement section does not address the problem or may even be missing. The plan and discussion of the solution have nothing to do with the problem

Plan
3 points: plan is clearly articulated AND will lead to a correct solution
2 points: plan is articulated reasonably well and correct OR may contain a minor flaw based on a correct interpretation of the problem
1 point: plan is not clearly presented OR only partially correct based on a correct/partially correct understanding of the problem
0 points: no plan OR the plan is completely incorrect

Solution
3 points: solution is correct AND clearly labeled OR though the solution is incorrect it is the expected outcome of a slightly flawed plan that is correctly implemented
2 points: solution is incorrect due to a minor error in implementation of either a correct or incorrect plan OR solution is not clearly labeled
1 point: solution is incorrect due to a significant error in implementation of either a correct or incorrect plan
0 points: no solution is given

Presentation
Higher rating: overall appearance of the paper is neat and easy to read, and all pertinent information can be readily found
Lower rating: paper is hard to read OR pertinent information is hard to find

Holistic Grading of Math Problem Solvingviii

Analyzed holistically:
3 points: All of the following characteristics must be present: answer is correct; explanation is clear and complete; explanation includes complete implementation of a mathematically correct plan.
2 points: Exactly one of the following characteristics is present: answer is incorrect due to a minor flaw in plan or an algebraic error; explanation lacks clarity; explanation is incomplete.
1 point: Exactly two of the characteristics in the 2-point section are present, OR one or more of the following characteristics are present: answer is incorrect due to a major flaw in the plan; explanation lacks clarity or is incomplete but does indicate some correct and relevant reasoning; plan is partially implemented and no solution is provided.
0 points: All of the following characteristics must be present: answer is incorrect; explanation, if any, uses irrelevant arguments; no plan for solution is attempted beyond just copying data given in the problem statement.

Grading rubrics can be applied to a wide variety of subjects and used in association with a range of assessment techniques. (See the endnote on rubrics for references to good practices for using rubrics and for a range of examples of rubrics at a variety of colleges and across several disciplines.)
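For departments that record rubric ratings electronically, the analytic/holistic distinction can be illustrated in a few lines of code. The sketch below is not from the workshop materials; it uses the essay traits from the template above and assumes, purely for illustration, a 0 to 4 point scale and an invented set of ratings.

```python
# A hypothetical sketch of applying a rubric analytically vs. holistically.
# Traits follow the essay template above; the 0-4 point scale and the sample
# ratings are assumptions made for illustration.

LEVELS = {"unsatisfactory": 0, "poor": 1, "satisfactory": 2, "superior": 3, "excellent": 4}
TRAITS = ["Development", "Organization", "Style", "Mechanics"]

def analytic_score(ratings):
    """Analytic use: rate every primary trait separately, then sum the points."""
    return sum(LEVELS[ratings[trait]] for trait in TRAITS)

# Analytic: one level assigned per trait for a single essay.
essay_ratings = {
    "Development": "superior",
    "Organization": "satisfactory",
    "Style": "excellent",
    "Mechanics": "satisfactory",
}
print("Analytic total:", analytic_score(essay_ratings), "of", max(LEVELS.values()) * len(TRAITS))

# Holistic: the rubric guides a single overall judgment instead.
holistic_rating = "satisfactory"
print("Holistic rating:", holistic_rating, f"({LEVELS[holistic_rating]} points)")
```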

Before doing these two activities on rubrics, read “Developing and Applying Rubrics” by Mary Allen in Appendix 3. If possible, review some of the sample rubrics listed in Appendix 4.

Activity 4: Building a Rubric
Using the grid in Appendix 5A, select or write an SLO, identify Primary Traits, and then decide on “observables” for each assessment level.

Activity 5: Using a Grading Rubric and Norming the Results
Use the English rubric in Appendix 5B to grade the sample student essay in Appendix 5C. Compare your results with colleagues who graded the same paper. Where were your assessments different? Can you come to agreement on the overall rating of the paper?


To this point we have discussed stating the desired student learning outcome and developing a grading rubric. These are the beginning steps that can lead us toward collecting and using the results of measured student learning outcomes. A road map of a possible “SLO Assessment Plan” is shown in the diagram below.

Selecting the Assessment Method: Authentic Assessment and Deep Learning

The next logical question is “What assessment method should be used?” There are certainly a wide variety of methods for determining whether or not a student has demonstrated learning of a particular objective.

Summary of Tools for Direct Assessment of Student Learningix

Capstone Project/Course—a project or course which, in addition to a full complement of instructional objectives, also serves as the primary vehicle of student assessment for the course or program.

Criterion-Referenced Tests—a measurement of achievement of specific criteria or skills in terms of absolute levels of mastery. The focus is on performance of an individual as measured against a standard or criteria rather than against performance of others who take the same test, as with norm-referenced tests.

Norm-Referenced Test—an objective test that is standardized on a group of individuals whose performance is evaluated in relation to the performance of others; contrasted with criterion-referenced test.

Portfolio—a collection of student work organized around a specific goal (e.g., a set of standards, benchmarks, or instructional objectives); it can contain items such as handouts, essays, rough drafts, final copies, artwork, reports, photographs, graphs, charts, videotapes, audiotapes, notes, anecdotal records, and recommendations and reviews; each item in the portfolio provides a portion of the evidence needed to show that the goal has been attained.

Performance Assessments—activities in which students are required to demonstrate their level of competence or knowledge by creating a product or response scored so as to capture not just the "right answer", but also the reasonableness of the procedure used to carry out the task or solve the problem.

Rating Scales—subjective assessments made on predetermined criteria in the form of a scale. Rating scales include numerical scales or descriptive scales. Forced choice rating scales require that the rater determine whether an individual demonstrates more of one trait than another.

Simulation—a competency-based measure in which pre-operationalized abilities are demonstrated in a direct, real-world approach. Simulation is primarily used to approximate the results of a performance assessment when direct demonstration of the student skill is impractical due to the target competency involved, logistical problems, or cost.

Activity 6
Read the article “The Case for Authentic Assessment” by Grant Wiggins in Appendix 6. Discuss the assessment methods you use in your classes. What methods do you use? How effective do you find them?

Course Level TLC: Elements of an Assessment Plan (flow diagram)
Course Objective → Context or Conditions → Primary Traits → Observables for Each Performance Level → Assessment Method Selected → Norm Among Instructors → Evaluate Student Work → Compile Results → Use Feedback for Improvement
The early steps make up the Statement of Desired SLO and the Grading Rubric; the later steps feed the Assessment Report (compiled for each desired SLO); the whole cycle is carried out through faculty collaboration.


Activity 7
View the film “A Private Universe.”x Discuss the implications for producing and assessing deep learning.

As I have listened to faculty discuss assessment methods (at six statewide California Assessment Institutes, eight regional RP/CAI workshops, and our own college’s summer institute on SLOs), I have come to several conclusions:

School-of-Education level discussions of assessment instruments are not well received.

Faculty are eager to talk about the challenges they experience in assessing students.

Discussions often turn to great stuff such as authentic assessment and deep learning.

Most faculty use a rather narrow range of methods—but use them well.

Faculty will more often try another assessment technique if recommended by a colleague.

Many faculty use assessments that need just slight enhancement to yield SLO results.

A few specifics on the last point may help:

One vocational department teaches portfolios in its introductory course—and uses portfolios when doing faculty career advising—but does not follow through by having students add to the portfolio as competencies are acquired in subsequent courses. The capstone course in this department has students build a portfolio as part of preparing to enter the job market, but there is no connection with the portfolio in the intro class nor is there a grading rubric.

One department has a clinical component in which students are evaluated using a rating sheet on their hands-on competencies. The department has complained about needing feedback from clinical to the theory courses, but has not consistently used the results of the rating sheets for this purpose. The competencies taught in the theory course are fairly well aligned with those assessed in clinical but could be improved.

Faculty in one of the social science departments have worked on departmental standards for term papers to the point of a primary trait analysis and meet regularly to discuss grading of term papers but have not filled in the observables to establish a rubric.

The English department has a grading rubric for written essays, and full- and part-time faculty have regular norming sessions to improve consistency of grading, but the system has only been used for two courses, freshman comp and its prerequisite.

Based on these observations, my recommendation is to start with these good things that faculty are doing, get them engaged in talking about grading (Effective Grading: A Tool for Learning and Assessment by Barbara Walvoord and Virginia Anderson has been great for this), get faculty to share assessment strategies with one another—especially across disciplines, and provide the support for moving these good existing assessment practices to the next level.


Norming or Inter-Rater Reliability: Assuring Consistency of Grading Among Faculty

Whatever method is chosen to assess student learning and apply the agreed-upon grading rubric, faculty who teach sections of the course should work together to assure that the results of grading student work are consistent. This process is known as “norming” or “inter-rater reliability” and has been used in a variety of venues including construction of standardized tests, evaluating placement test writing samples and ranking grant proposals. An explicit process for establishing inter-rater reliability would be to have evaluators use the grading rubric on a series of student assignments and then evaluate the extent of agreement using standard statistical measures. (The kappa statistic, the chi square test, the Pearson correlation coefficient, and percent agreement have all been used under various circumstances.) Agreement can be improved through discussion and training. Norming can be performed informally by having regular discussions among faculty raters, reviewing and debating examples related to the observables in the grading rubric until consensus is reached.xi
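For raters who want to quantify their agreement, the statistics mentioned above are straightforward to compute once two instructors have scored the same set of papers. The following Python sketch is illustrative only; the two sets of ratings are invented, and it shows simple percent agreement alongside Cohen's kappa (one common form of the kappa statistic for two raters).

```python
# A minimal sketch (not from the workshop materials) of two agreement
# statistics: percent agreement and Cohen's kappa. Ratings are invented.
from collections import Counter

# Scores two hypothetical instructors gave the same ten essays,
# using a shared rubric scale of 1 (unsatisfactory) to 4 (excellent).
rater_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_b = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]

def percent_agreement(a, b):
    """Fraction of papers on which the two raters gave the same score."""
    return sum(1 for x, y in zip(a, b) if x == y) / len(a)

def cohens_kappa(a, b):
    """Observed agreement corrected for the agreement expected by chance."""
    n = len(a)
    observed = percent_agreement(a, b)
    counts_a, counts_b = Counter(a), Counter(b)
    expected = sum((counts_a[s] / n) * (counts_b[s] / n) for s in set(a) | set(b))
    return (observed - expected) / (1 - expected)

print(f"Percent agreement: {percent_agreement(rater_a, rater_b):.0%}")
print(f"Cohen's kappa:     {cohens_kappa(rater_a, rater_b):.2f}")
```

Informal norming sessions do not require this arithmetic, but a quick calculation like this can show whether discussion and training are actually narrowing the differences between raters.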

With the statement of the desired student learning outcome in place, with the grading rubric established and normed, the results collected can be powerful information for improving student learning—and may provide the basis for directing college resources in areas to address the learning gaps identified.

The Assessment Report: Sharing the Results of Student Learning Outcomes

A sensitive aspect of the discussion of Student Learning Outcomes has been how the information is to be used. Most importantly, the significance of the results relates directly to improving teaching and learning. Most of that improvement lies with faculty—curriculum design, pedagogy, learning environment, assessment methods and the like. The rest is in the hands of the college’s support system—providing facilities, equipment, student services and so on to the instructional program—to make those improvements identified by SLO results. To the extent that we can build an Assessment Report that focuses on the instructional program level (helping faculty improve student learning and identifying needed college resources), college faculty and staff will buy into the process. The examples below illustrate a few key points:

An analysis of the results—by faculty, particularly all program faculty—must accompany the results.

Results can be listed completely or summarized in narrative form.

Specific actions to be taken as a consequence of the results should be described.

Results often contradict our assumptions of how and what students learn.

Use of SLO results can be effectively centered in the instructional program as the locus of change.

Simple presentations of results form elegant evidence for accreditation.

The key components of the Assessment Plan are the student learning outcomes statements and the assessment methods used for each. The plan often includes benchmarks that indicate the incremental gains expected in the assessment results. The essential features of the Assessment Report are a summary of the results of student evaluations, an analysis of those findings, and a summary of the actions taken to improve the student assessment performance. The diagram below summarizes the elements of an effective Assessment Plan and the resulting Assessment Report. Examples of Assessment Reports are shown in the Appendix.

Course Assessment Report
Department __________________________________ Term & Year __________________
Course Name and Number ________________________________________________________

Student Learning Outcome Statements / Assessment Method Description (attach rubric)
1. □ Capstone Project □ Embedded Test Question □ Portfolio □ Performance Assessment □ Rating Scale □ Other
2. □ Capstone Project □ Embedded Test Question □ Portfolio □ Performance Assessment □ Rating Scale □ Other
3. □ Capstone Project □ Embedded Test Question □ Portfolio □ Performance Assessment □ Rating Scale □ Other

Assessment Results / Analysis & Actions Taken
1.
2.
3.

Program Level Student Learning Outcomes

The term “program” here refers to core required courses for occupational programs and lower division major preparation for transfer programs. Many professional societies have standards or competencies that can be used as the basis for program level SLOs. (Some examples are referenced in the endnotes and summarized in the Appendix.) Often, however, these competencies are in the form of discrete skills rather than more global outcomes that would lend themselves to summaries of student learning by those who have completed those programs. An example of aggregating detailed standards into more comprehensive SLO statements is this sample taken from the American Psychological Association.xii

Example of Aggregation of Specific Program Competencies into a Program Student Learning Outcome

Global Student Learning Outcome: Use critical thinking effectively.

Specific Competencies:
a. Evaluate the quality of information, including differentiating empirical evidence from speculation and the probable from the improbable.
b. Identify and evaluate the source, context, and credibility of information.
c. Recognize and defend against common fallacies in thinking.
d. Avoid being swayed by appeals to emotion or authority.
e. Evaluate popular media reports of psychological research.
f. Demonstrate an attitude of critical thinking that includes persistence, open-mindedness, tolerance for ambiguity and intellectual engagement.
g. Make linkages or connections between diverse facts, theories, and observations.
From Undergraduate Psychology Major Learning Goals and Outcomes: A Report, American Psychological Association, March 2002xii


Direct and Indirect Measures of Student Learning Outcomes

One consideration of the method is whether it is a direct or indirect measure of student learning. Direct assessment uses criteria that measure student learning directly, such as writing an essay, giving a speech, solving problems, completing a capstone experience, or evaluating a portfolio of student-created products. Indirect assessment examines student performance or behavior using criteria which, if accomplished, assume learning has taken place. Examples include surveys of students and employers, exit interviews of graduates, retention and transfer studies, and job placement data.xiii

Indirect measures are often thought of as outputs: course completions, degrees, certificates, and transfers, for example. These are the institutional accountability measures tracked by the California Community Colleges’ Partnership for Excellence initiative. These measures are often key indicators of success for a program, as exemplified below.

Example of the Use of Direct and Indirect Measures of Student Learning
From Oklahoma State University: http://www.okstate.edu/assess

Student Outcomes for Geology. Upon degree completion, students will:
Demonstrate understanding of the basic concepts in eight subject areas: physical geology, historical geology, mineralogy, petrology, sedimentology/stratigraphy, geomorphology, paleontology, and structural geology. (Direct)
Demonstrate technical skills in the collection and analysis of geologic data, critical-thinking skills, plus written and verbal communication skills. (Direct)
Apply geologic knowledge and skills to a range of problems faced by business, industry, and government. (Direct)
Gain employment in the geology profession or advance to graduate studies in geology or an allied field. (Indirect)

Identifying Program Competencies—External & Internal Sources

One of the new activities that the accreditation standards require is the construction of competencies for each of our degree and certificate programs. One way to approach this task is to begin with the competencies or standards that are used by state or national professional organizations or licensing/credentialing bodies. These groups span a wide range of disciplines both academic and vocational. Some examples:

The American Welding Society publishes welding codes and standards on which an extensive AWS curriculum is based. Many community colleges give students AWS certification tests based on these competencies.

The California Board of Registered Nursing uses standards of competent performance and tests nursing applicants for licensure in many nursing fields.

The American Psychological Association recently published Undergraduate Psychology Learning Goals and Outcomes that lists both global student learning outcomes and detailed competencies for both the psych major and liberal studies students.

The California State Board of Barbering and Cosmetology tests graduates for licensure based on curriculum standards enacted in Title 16 of the California Code of Regulations.

Program: a sequence of courses leading to an educational goal in accord with the mission of the California Community Colleges: transfer, associate degree (both of which have major and general education components), certificate, basic skills, or workforce skill upgrades.


Links to these and other competencies and standards are found in the Appendix. While an individual program may not teach to all the outcomes that these groups specify, the lists are an excellent starting point. Not all programs have industry associations or professional societies who write standards. Such programs may need to consult local vocational advisory committees or faculty colleagues at neighboring institutions.

Strategies for Direct Assessment of Program SLOs: Mosaic and Capstone Approaches

The Mosaic Approach. Assessment of program-level student learning outcomes can be approached by assessing either detailed competencies or more global program learning goals. (Look again at the American Psychological Association example above for the distinction between a global SLO statement and its detailed competencies.) Assessing detailed competencies views the acquiring of knowledge, skills and attitudes as taking place rather like assembling a complex mosaic from individual colored tiles. It is a more analytical model and provides more targeted information about student learning. However, the extent of the effort to find authentic assessments for a large number of “mosaic” competencies, get agreement among program faculty on those assessments, construct rubrics, norm on samples of student work, and then collect and analyze the data may stretch program resources to the breaking point. Furthermore, the acquisition of small, discrete packets of knowledge may not lead the student to acquire a more integrated understanding that provides needed applicability to the next step in that student’s career, be it transfer or directly entering the job market. Consequently, more holistic assessments are often preferred, such as capstone courses or internships.

The Program Audit. Even if an integrated assessment is used at the end of the program, it is useful to identify where in the curriculum each SLO (or even individual competency) is acquired. Furthermore, learning most often occurs in cycles: the student will be exposed to a topic, then later gain competency in that area, and finally master that skill. Doing a program audit of exactly where SLOs and/or competencies are introduced, reinforced, and mastered in the program course offerings is a useful exercise. A template for such a program audit is shown below. Several colleges use such a model to connect individual course learning outcomes statements with the more global program level learning outcomes statements.

Curriculum Audit Grid: Identifying Specific Competencies in the Program “Mosaic”

Courses (columns): 201, 202, 205, 207, 251, 260, 313, 314, 320, 425
Outcomes (rows, with marks distributed across the courses above):
1. Recognize and articulate approaches to psychology (I, E, R)
2. Independently design valid experiments (I, E, R)
3. Articulate a philosophy of psych/Christian integration (I, I, R, R, R, R, R, R, E)
Modeled after Hatfield (1999). I = Introduced, E = Emphasized, R = Reinforced

Example from “A Program Guide for Outcomes Assessment” by Geneva College, April 2000: http://www.geneva.edu/academics/assessment/oaguide.pdf
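Once an audit grid exists, even a short script can flag gaps, for example an outcome that is introduced but never reinforced anywhere in the required courses. The sketch below is a hypothetical illustration patterned after the Geneva College grid above; the course numbers, outcomes, and marks are invented.

```python
# A hypothetical curriculum audit check. Course numbers, outcomes, and
# I/E/R marks are invented, patterned after the audit grid above.

# outcome -> {course: mark}, where I = Introduced, E = Emphasized, R = Reinforced
audit_grid = {
    "Recognize and articulate approaches to psychology": {"201": "I", "313": "E", "425": "R"},
    "Independently design valid experiments": {"205": "I", "313": "E"},
    "Articulate a philosophy of psych/Christian integration": {"201": "I", "320": "R"},
}

MARK_NAMES = {"I": "introduced", "E": "emphasized", "R": "reinforced"}

for outcome, courses in audit_grid.items():
    missing = set(MARK_NAMES) - set(courses.values())
    if missing:
        gaps = " or ".join(MARK_NAMES[m] for m in sorted(missing))
        print(f"GAP: '{outcome}' is never {gaps} in the required courses")
    else:
        print(f"OK:  '{outcome}' is covered across courses {', '.join(sorted(courses))}")
```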

Activity 8 – Synthesizing Global Program SLO Statements from Course Objectives
Appendix 10 presents an exercise in synthesizing course SLO statements into program SLO statements. Do this exercise in small groups and then compare results between groups.


The Capstone Approach. The above examples focus on the connection between global program outcomes and the individual competencies students acquire in courses. These individual competencies may be measured through assessment instruments embedded in those courses, be they test questions, performance evaluations, or any other tool for which a normed rubric has been generated. Quite often, program outcomes are determined by a more holistic measure such as a portfolio or other capstone assignment, a survey, or indirect measures such as success in transfer or employment. While these program assessment techniques are considerably simpler to construct and carry out, they may not provide the diagnostic feedback which would enable more targeted improvements in teaching and learning. The table below gives an example of such a holistic or capstone program assessment report.

PARKLAND COLLEGE ACADEMIC ASSESSMENT (Excerpts)
DEPARTMENT: Fine and Applied Arts   PROGRAM: Mass Communication
Methods: Indirect Assessment Measures

Outcome 1
Intended Outcome (Objective): Students will demonstrate proficiency in employable Mass Communication skills.
Assessment Criteria & Methods (Expected Results): Students will demonstrate desired mass communication competencies as shown by individual portfolios, when assessed by representatives from industry, as reported on the MC Portfolio Evaluation form.
Actual Results: Written comments from industry representatives indicate that MC student portfolio assessment ranked 4 (on a scale of 1 to 5, five being the highest score). Suggestions were to include more Web site graphics in the curriculum.
Analysis & Action: Desktop Graphics program revised to include more experience in Web site graphics. Students designed graphics for the current MC home page and links.

Outcome 2
Intended Outcome (Objective): Students will demonstrate learning the basic concepts necessary to perform satisfactorily in Mass Communications entry-level jobs.
Assessment Criteria & Methods (Expected Results): When surveyed using the Parkland College Student Occupational Follow-Up Survey, graduates will describe satisfaction with their Mass Communication knowledge to recall, analyze, evaluate, and utilize basic concepts.
Actual Results: Feedback from employers and students strongly indicated that the Visual Arts program option had become obsolete; preference is given to graduates with Desktop Publishing skills.
Analysis & Action: Visual Arts program option shelved.

Outcome 3
Intended Outcome (Objective): Students in the Mass Communication A.A. program will have the knowledge to successfully complete a Bachelor’s degree in Mass Communication.
Assessment Criteria & Methods (Expected Results): Four-year institutions will report a 75 percent acceptance rate into Mass Communication programs.
Actual Results: The U of I Coordinator of Transfer Articulation reported that out of 29 applicants from other schools to Graphics, a Mass Com student was the only admit.
Analysis & Action: Continue to gather/monitor data. Investigate how many Parkland Graphics students applied.

Activity 9 – Creating Program SLO Statements and Performing a Program Audit
With a group of faculty from your discipline, write a set of Program SLOs for a degree or certificate in your area.
Assemble the outlines of record for the required courses for a degree or certificate in your discipline.
List the course objectives for all of these courses, preferably after having revised them to “robust” objectives/student learning outcomes as described previously.
Identify which course objectives match with each Program SLO statement. Present the results in a table format like that above.
You may wish to categorize each course objective by the extent to which it moves students toward mastery of the Program SLO.

(Assessment method options: Pre/Post Tests; Capstone exam/project; Primary Trait Analysis; Course Embedded Test; Standardized Exams; Professional Certification; Portfolios; Performance Assessment; Other; Transfer/Employment Data; Grad Surveys/Interviews; Employer/Faculty Surveys.)


Summary of Program Level SLO Assessment Strategies
Indirect (implies that SLOs are achieved): Transfer; Program completion; Job placement; Employer surveys; Student exit surveys; Licensure exams.
Direct (students assessed on SLOs while in program):
Capstone strategies: Capstone course or project; Standardized test (commercial or local, sampled or comprehensive); Internship/clinical workplace evaluation; Portfolio (student- or instructor-generated).
Mosaic strategies: embedded assessment/program audit.

General Education Student Learning Outcomes

Assessment of learning outcomes in general education can be approached rather like those for programs in the major. Most colleges write global learning statements and then break those down into specific competencies that are on the level of course objectives. Several examples are given in Appendix 12, including an audit grid for general education competencies.

California Community Colleges have three sets of general education patterns to offer to students: the associate degree pattern set by Title 5, the CSU GE-Breadth pattern, and IGETC. While these patterns are similar, they have significant differences. The competency statements found in the source documents for CSU GE-Breadth and IGETC can be a useful starting point for colleges beginning the process of constructing SLO statements for general education categories.

Program Level TLC (cycle diagram):
1. Implement Program → 2. Identify Indirect Measures → 3. Identify Direct Measures → 4. Collect Assessment Results → 5. Disseminate & Reflect on Results → 6. Decide on Program Improvements → back to 1. Implement Program

General Education Patterns Available to Students (Merced College Example)


Merced College AA:
A. Language & Rationality
B. Natural Sciences
C. Humanities
D. Social & Behavioral Sciences
E. Lifelong Understanding & Self-Development
F. History & Government

CSU GE-Breadth:
A1. Oral Communication
A2. Written Communication
A3. Critical Thinking
B1. Physical Science
B2. Life Science
B3. Laboratory Activity
B4. Mathematics/Quantitative Reasoning
C1. Arts
C2. Humanities
D. Social, Political & Economic Institutions & Behavior; Historical Background
E. Lifelong Understanding & Self-Development

IGETC:
1A. English Composition
1B. Critical Thinking
1C. Oral Communication
2. Mathematical Concepts & Quantitative Reasoning
3A. Arts
3B. Humanities
4. Social & Behavioral Sciences
5A. Physical Science
5B. Biological Science
6. Language Other Than English

For more information refer to CSU Executive Order 595 and “IGETC Notes” 1, 2 and 3

Activity 10 – Writing Global SLO Statements with Specific Competencies for Each
Review the models of general education student learning outcomes in Appendix 12 (assumes knowledge of CCC GE, CSU GE-Breadth and IGETC patterns).
For each college GE area, write a global student learning outcome statement.
For each college GE area, write specific competency SLO statements under each of the global SLO statements.

Activity 11 – Performing a General Education Program Audit
Assemble the outlines of record for the courses approved in each GE area.
List the course objectives for all of these courses, preferably after having revised them to “robust” objectives/student learning outcomes as described previously.
Identify which course objectives match with each GE SLO statement. Present the results in a table format like that discussed previously.
You may wish to categorize each course objective by the extent to which it moves students toward mastery of the AA/AS GE SLO: I = Introduced, E = Emphasized, or R = Reinforced.

Conclusion

In presenting preliminary findings to be published in an upcoming monograph, Jack Friedlander, Executive Vice President of Santa Barbara City College, concluded that most colleges around the country are still at the process level of developing SLOs. Nevertheless, there are many examples of excellent work on SLOs at colleges around the country, summarized in Appendix 13. These examples should give colleges that are new to the Student Learning Outcomes process the shared experiences of their colleagues, making the learning curve easier to climb. The climate in education today simply will not allow us to expend valuable time and energy on a process that will not yield useful results. Such results have the potential to allow faculty and others to engage in reflection about the process of teaching and learning and then use the insights they develop to adjust the teaching-learning-assessment process to optimize learning to the full extent possible. By having a clear path to those results, we can move ahead with taking the first few steps. But we need to keep our eye on the goal as we’re walking. Remember, utility can quickly become futility by adding a few f’s!


Appendix 1 – Good Practices in Assessing Student Learning Outcomes

Appendix 1A – An Assessment Manifesto by College of DuPage (IL)

This 10-point manifesto is taken from the end section of 500 Tips on Assessment by Sally Brown, Phil Race and Brenda Smith, published by Kogan Page in the Spring of 1996. We state some values which we believe should underpin assessment, whatever form it takes and whatever purpose it serves. Our thinking on these values owes a debt to the work of the Open Learning Foundation Assessment Issues Group, in which we all participated, and to the values adopted by the UK Staff and Educational Development Association for its Teacher Accreditation Scheme, and Fellowship Scheme.

1. Assessment should be based on an understanding of how students learn. Assessment should play a positive role in the learning experiences of students.

2. Assessment should accommodate individual differences in students. A diverse range of assessment instruments and processes should be employed, so as not to disadvantage any particular individual or group of learners. Assessment processes and instruments should accommodate and encourage creativity and originality shown by students.

3. The purposes of assessment need to be clearly explained. Staff, students, and the outside world need to be able to see why assessment is being used, and the rationale for choosing each individual form of assessment in its particular context.

4. Assessment needs to be valid. By this, we mean that assessment methods should be chosen which directly measure that which it is intended to measure, and not just a reflection in a different medium of the knowledge, skills or competences being assessed.

5. Assessment instruments and processes need to be reliable and consistent. As far as is possible, subjectivity should be eliminated, and assessment should be carried out in ways where the grades or scores that students are awarded are independent of the assessor who happens to mark their work. External examiners and moderators should be active contributors to assessment, rather than observers.

6. All assessment forms should allow students to receive feedback on their learning and their performance. Assessment should be a developmental activity. There should be no hidden agendas in assessment, and we should be prepared to justify to students the grades or scores we award them, and help students to work out how to improve. Even when summative forms of assessment are employed, students should be provided with feedback on their performance, and information to help them identify where their strengths and weaknesses are.

7. Assessment should provide staff and students with opportunities to reflect on their practice and their learning. Assessment instruments and processes should be the subject of continuous evaluation and adjustment. Monitoring and adjustment of the quality of assessment should be built into quality control processes in universities and professional bodies.

8. Assessment should be an integral component of course design, and not something bolted on afterwards. Teaching and learning elements of each course should be designed in the full knowledge of the sorts of assessment students will encounter, and be designed to help them show the outcomes of their learning under favorable conditions.

9. The amount of assessment should be appropriate. Students' learning should not be impeded by being driven by an overload of assessment requirements, nor should the quality of the teaching conducted by staff be impaired by excessive burdens of assessment tasks.

10. Assessment criteria need to be understandable, explicit and public. Students need to be able to tell what is expected of them in each form of assessment they encounter. Assessment criteria also need to be understandable to employers, and others in the outside world.


Appendix 1B – Palomar College (CA) Statement of Principles on Assessment

Why do Assessment?

Palomar’s Vision Statement projects a future in which "Palomar College judges its work and its programs and formulates its policies primarily on the basis of learning outcomes and has a comprehensive program for assessing those outcomes and responding to its findings." We adopted this strategic goal even before our accrediting body revised its accreditation standards to "focus on outcomes and accomplishments, embracing a model of accreditation which requires assessment of resources, processes, and outcomes at the institutional level." Thus our own commitment to assess student learning at the institutional level precedes, but complements, the mandates of accreditation. To carry out that commitment, Palomar will develop and continuously refine and improve an institutional framework for assessing student learning and using the information gained from such assessment to serve our students better.

What is assessment?

We mean by "assessment" "the systematic collection, analysis, interpretation, and use of information to understand and improve teaching and learning" (Tom Angelo).

What is assessment for?

At Palomar, we will use assessment primarily to understand, and thereby improve, student learning. More specifically, assessment can serve the following roles in the institution:

To provide improved feedback, guidance, and mentoring to students so as to help them better plan and execute their educational programs.
To provide improved feedback about student learning to support faculty in their work.
To help us design and modify programs to better promote learning and student success.
To develop common definitions and benchmarks for important student abilities that will enable us to act more coherently and effectively to promote student learning.
To help us understand how different groups of students experience the college differently so as to adapt our courses and programs to the needs and capacities of all students.
To help us understand how our different courses and programs affect students over time so that we can better coordinate and sequence the student’s experience to produce more and deeper learning.

What is assessment not for?

Different institutions may, of course, use the tools of learning assessment differently. It will help to clarify the nature of Palomar’s commitment to learning assessment to specify some of the possible purposes of assessment that we will exclude from our approach.

We will not use assessment as an end in itself. Assessment that does not help us to promote student learning is a waste of time.

We will not use assessment of student learning punitively or as a means of determining faculty or staff salaries or rewards. The purpose of assessment is to evaluate student learning, not to reward or punish faculty or staff.

We will not use any single mode of assessment to answer all questions or strictly determine program decisions.


We will not use assessment in a way that will impinge upon the academic freedom or professional rights of faculty. Individual faculty members must continue to exercise their best professional judgment in matters of grading and discipline.

We will not assume that assessment can answer all questions about all students. We need not directly assess all students in order to learn about the effectiveness of our programs and policies.

We will not assume that assessment is quantitative. While numerical scales or rubrics (such as the four-point grading scale) can be useful, their accuracy always depends on the clear understanding of the concepts behind the numbers. Often the best indicator of student learning can be expressed better as a narrative or a performance than as a number.

We will not use assessment only to evaluate the end of the student’s experience or merely to be accountable to outside parties. Assessment must be ongoing observation of what we believe is important.

We will not assume that assessment is only grading.

Who will do assessment?

Palomar's faculty, in consultation with the entire college community, will shape and design institutional assessment activities and will identify the core knowledge and skills that our students need to master. The faculty will likewise develop benchmarks by which student progress can be evaluated. These will be ongoing processes, open to modification and improvement. Not all assessment need be done in individual classes, and not every faculty member need assess all of the core learning.

How will we use assessment?

The following guidelines will govern the methodology and approach we will employ for institutional assessment at Palomar:

We will always seek multiple judgments of student learning rather than a single standard.

We will assess those skills and knowledge that our faculty, in consultation with the entire college community, judges to be important and valuable.

We will assess the ongoing progress of students throughout their experience at the college.


Appendix 1C – AAHE Nine Principles of Good Practice for Assessing Student Learning

AAHE ASSESSMENT FORUM
9 Principles of Good Practice for Assessing Student Learning

1. The assessment of student learning begins with educational values. Assessment is not an end in itself but a vehicle for educational improvement. Its effective practice, then, begins with and enacts a vision of the kinds of learning we most value for students and strive to help them achieve. Educational values should drive not only what we choose to assess but also how we do so. Where questions about educational mission and values are skipped over, assessment threatens to be an exercise in measuring what's easy, rather than a process of improving what we really care about.

2. Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time. Learning is a complex process. It entails not only what students know but what they can do with what they know; it involves not only knowledge and abilities but values, attitudes, and habits of mind that affect both academic success and performance beyond the classroom. Assessment should reflect these understandings by employing a diverse array of methods, including those that call for actual performance, using them over time so as to reveal change, growth, and increasing degrees of integration. Such an approach aims for a more complete and accurate picture of learning, and therefore firmer bases for improving our students' educational experience.

3. Assessment works best when the programs it seeks to improve have clear, explicitly stated purposes. Assessment is a goal-oriented process. It entails comparing educational performance with educational purposes and expectations -- those derived from the institution's mission, from faculty intentions in program and course design, and from knowledge of students' own goals. Where program purposes lack specificity or agreement, assessment as a process pushes a campus toward clarity about where to aim and what standards to apply; assessment also prompts attention to where and how program goals will be taught and learned. Clear, shared, implementable goals are the cornerstone for assessment that is focused and useful.

4. Assessment requires attention to outcomes but also and equally to the experiences that lead to those outcomes. Information about outcomes is of high importance; where students "end up" matters greatly. But to improve outcomes, we need to know about student experience along the way -- about the curricula, teaching, and kind of student effort that lead to particular outcomes. Assessment can help us understand which students learn best under what conditions; with such knowledge comes the capacity to improve the whole of their learning.  

5. Assessment works best when it is ongoing, not episodic. Assessment is a process whose power is cumulative. Though isolated, "one-shot" assessment can be better than none, improvement is best fostered when assessment entails a linked series of activities undertaken over time. This may mean tracking the progress of individual students, or of cohorts of students; it may mean collecting the same examples of student performance or using the same instrument semester after semester. The point is to monitor progress toward intended goals in a spirit of continuous improvement. Along the way, the assessment process itself should be evaluated and refined in light of emerging insights.

6. Assessment fosters wider improvement when representatives from across the educational community are involved. Student learning is a campus-wide responsibility, and assessment is a way of enacting that responsibility. Thus, while assessment efforts may start small, the aim over time is to involve people from across the educational community. Faculty play an especially important role, but assessment's questions can't be fully addressed without participation by student-affairs educators, librarians, administrators, and students. Assessment may also involve individuals from beyond the campus (alumni/ae, trustees, employers) whose experience can enrich the sense of appropriate aims and standards for learning. Thus understood, assessment is not a task for small groups of experts but a collaborative activity; its aim is wider, better-informed attention to student learning by all parties with a stake in its improvement.

7. Assessment makes a difference when it begins with issues of use and illuminates questions that people really care about. Assessment recognizes the value of information in the process of improvement. But to be useful, information must be connected to issues or questions that people really care about. This implies assessment approaches that produce evidence that relevant parties will find credible, suggestive, and applicable to decisions that need to be made. It means thinking in advance about how the information will be used, and by whom. The point of assessment is not to gather data and return "results"; it is a process that starts with the questions of decision-makers, that involves them in the gathering and interpreting of data, and that informs and helps guide continuous improvement.

8. Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change. Assessment alone changes little. Its greatest contribution comes on campuses where the quality of teaching and learning is visibly valued and worked at. On such campuses, the push to improve educational performance is a visible and primary goal of leadership; improving the quality of undergraduate education is central to the institution's planning, budgeting, and personnel decisions. On such campuses, information about learning outcomes is seen as an integral part of decision making, and avidly sought.

9. Through assessment, educators meet responsibilities to students and to the public. There is a compelling public stake in education. As educators, we have a responsibility to the publics that support or depend on us to provide information about the ways in which our students meet goals and expectations. But that responsibility goes beyond the reporting of such information; our deeper obligation -- to ourselves, our students, and society -- is to improve. Those to whom educators are accountable have a corresponding obligation to support such attempts at improvement.

Authors: Alexander W. Astin; Trudy W. Banta; K. Patricia Cross; Elaine El-Khawas; Peter T. Ewell; Pat Hutchings; Theodore J. Marchese; Kay M. McClenney; Marcia Mentkowski; Margaret A. Miller; E. Thomas Moran; Barbara D. Wright

This document was developed under the auspices of the AAHE Assessment Forum with support from the Fund for the Improvement of Postsecondary Education with additional support for publication and dissemination from the Exxon Education Foundation. Copies may be made without restriction.


Appendix 1D – “Closing the Loop” by Tom Angelo

Seven Common (Mis)Perceptions About Outcomes Assessment

1. We’re doing just fine without it. (Assessment is medicine only for the sick.)
2. We’re already doing it. (Assessment is just old wine in new bottles.)
3. We’re far too busy to do it. (Assessment is an “administrivial” burden.)
4. The most important things we do can’t/shouldn’t be measured. (Assessment is too reductive and quantitative.)
5. We’d need more staff and lots more money to do assessment. (Assessment is too complex and expensive.)
6. They’ll use the results against us. (Assessment is a trick or a Trojan horse.)
7. No one will care about or use what we find out. (Assessment is a waste of time.)

Seven Reasonable Responses to Those (Mis)Perceptions

1. We’re doing just fine without it.
Okay, then let’s use assessment to find out what works, and to help us document and build on our successes.

2. We’re already doing it.
Okay, then let’s audit all the assessments we already do to discover what we know and what we don’t.

3. We’re far too busy to do it.
Okay, but since we’re already doing it, let’s use assessment to see where and how we can save time and effort.

4. The most important things we do can’t/shouldn’t be measured.
And not everything measurable should be measured, but let’s see if we can agree on how we can tell when we’re succeeding in these most important things.

5. We’d need more staff and lots more money to do assessment.
Since we’re unlikely to get more resources, how, what, and where can we piggyback, embed, and substitute?

6. They’ll use the results against us.
They might. So, let’s build in strong safeguards against misuse before we agree to assess.

7. No one will care about or use what we find out.
To avoid that, let’s agree not to do any assessments without a firm commitment from stakeholders to use the results.

Seven Transformative Guidelines for Using Assessment to Improve Teaching and Learning

1. Build shared trust. Begin by lowering social and interpersonal barriers to change.
2. Build shared motivation. Collectively determine goals worth working toward and problems worth solving—and consider the likely costs and benefits.
3. Build a shared language. Develop a collective understanding of new concepts (mental models) needed for transformation.
4. Design backward and work forward. Design backward from the shared vision and long-term goals to develop coherent outcomes, strategies, and activities.
5. Think and act systematically. Understand the advantages and limitations of the larger system(s) within which we operate and seek connections and applications to those larger worlds.
6. Practice what we preach. Use what we have learned about individual and organizational learning to inform and explain our efforts and strategies.
7. Don’t assume, ask. Make the implicit explicit. Use assessment to focus on what matters most.


Appendix 1E – Five Myths of Assessment by David Clemens, Monterey Peninsula College


As my campus management moves inexorably toward Learning Outcomes, Outcomes Based Education (OBE), and Assessment rubrics, I realize that there are five major faculty objections, none of which has been adequately addressed. OBE, at one time or another, has appealed to both the Right and the Left. Its genesis was during the first Bush administration while its current adherents are more likely to be social utopians or professional administrators. The Right saw OBE as a means to accountability, productivity, and particularizing standards. The Left saw OBE as an engine for social change, attitude engineering, and infusing ideology into curriculum. That’s why the OBE vocabulary is such a loopy conflation of edu-babble, computer jargon, and therapy-speak. I summarize the five objections below for the benefit of other faculty facing this latest, management-driven, educational fad. The quotations are all assertions made by our “Learning Outcomes Task Force.”

I. “Assessment rubrics and learning outcomes will not affect teacher evaluation.” Nonsense. No one can make such a guarantee. Assessment schemes and expected outcomes are easily adapted for use on teacher evaluation forms. For example, “How well did the teacher explain your class’s learning outcomes?” And “How well did the teacher’s learning activities facilitate class and college learning outcomes?” Maybe not this year or next, but “learning outcomes” are a technocrat’s idea of education: flow charts, graph paper, and scores.

II. “Assessment does not intrude on your classroom.” Of course it does, in the most fundamental way. Every competent teacher has goals and grading criteria for his or her classes; many of us have used such schemes as writing Instructional Objectives or setting Cognitive and Affective Domain goals. Where “Assessment” intrudes is by insisting that all learning is observable and measurable. This may be true in skill development or performance courses (nursing or cello), but it is clearly false in humanities or art courses. There, as one Joseph Conrad character said of those who travel to Africa, “The changes take place inside, you know.” How does a student come to realize that Mozart is better than Britney Spears or Michelangelo better than Thomas Kinkade? No chart of the measurable and observable will tell you, yet it surely involves learning.

III. “Learning outcomes do not compromise academic freedom.” Academic freedom is a complex issue but basically its practice insures that students will be exposed to various, academically legitimate yet contradictory ideas. That is, they will be drawn into “the Great Conversation,” not simply inoculated with a currently

prevailing orthodoxy. Uniformity of input is anathema to academic freedom; uniformity of outcome is inhuman. After over 30 years teaching, I still have no idea what any individual student will “get out” of a class.

IV. “All students can succeed.” This premise is idealistic but misguided. The only way to insure equal outcomes is to water down standards. All students must have equal opportunity, but each student is a unique and complex individual. The reasons for “success” or “failure” cannot be teased apart from the mysteries of personality and talent.

V. “There should be unanimous learning outcomes for the whole college.” Impossible as well as undesirable, and most disturbing when espousing nebulous, therapeutic or value-charged goals. One teacher may prize collaboration while another values self-reliance. One favors “Globalism” while another favors “Globalization.” One teacher is Green, another is Libertarian. This is as it should be. You simply can’t have a college commitment both to “diversity” and to “unanimity.” That’s hypocrisy. In college education (as in science), respectful, learned disagreement is an essential part of the process. OBE is also behaviorist, Skinnerian, concerned solely with INPUT and OUTPUT, ignoring what happens in between. Deep learning is private, invisible, and frequently ineffable. Often it is dangerous, upsetting, and unpredictable. You can’t put it on the Internet, and you can’t turn it into a PowerPoint magic lantern show. What I find is that OBE and Learning Outcomes and Assessment are not about education at all; they are about control. Nothing is more seductive to ideologues and to management than the prospect of creating a meaningless “jargon and data storm” to justify or conceal whatever they do. Where does it end? As William S. Burroughs said, “ . . . control can never be a means to any practical end . . .. It can never be a means to anything but more control . . . ” (133).

Work Cited
Burroughs, William S. Naked Lunch. New York: Grove Press, 1956. Reissue edition 1992.

David Clemens has taught, part time and full, at Monterey Peninsula College since 1971. He has published in New Directions in Teaching, San Francisco Chronicle, Teaching English in the Two-Year College, San Jose Mercury, New Morning, and Informal Logic. He was a Contributing Editor of Media and Methods for ten years and serves on the Science and Academic Board of the Foundation for Research in Accelerating Change.


Appendix 2 – Activity #3: Writing Student Learning Outcomes
Review the first example. Then for the second course objective, complete the Performance Context, Measurable Objective, and Primary Traits.

Finally, select an objective from a course in your discipline and construct the three-part SLO statement.

Example:
Course Objective: Match the various types of sheet metal welding methods to the appropriate application.
Performance Context: Given specifications and materials requiring a weld,
Measurable Objective: evaluate the performance needs and match the welding method to the required application.
Grading Criteria/Primary Traits: Welds should have a quality edge joint, meet design specifications, have an evenly positioned weld bead with good penetration, and have the minimum heat-affected zone to maximize strength of the weld.

Second course objective (complete the remaining three parts):
Course Objective: Demonstrate and develop correct keyboarding techniques applicable to keyboarding by touch for speed and accuracy.
Performance Context:
Measurable Objective:
Grading Criteria/Primary Traits:


Appendix 3 – Developing and Applying Rubrics
Mary Allen, CSU Institute for Teaching & Learning, [email protected]

Scoring rubrics are explicit schemes for classifying products or behaviors into categories that vary along a continuum. They can be used to classify virtually any product or behavior, such as essays, research reports, portfolios, works of art, recitals, oral presentations, performances, and group activities. Judgments can be self-assessments by students; or judgments can be made by others, such as faculty, other students, fieldwork supervisors, and external reviewers. Rubrics can be used to provide formative feedback to students, to grade students, and/or to assess programs.

There are two major types of scoring rubrics:
• Holistic scoring - one global, holistic score for a product or behavior
• Analytic rubrics - separate, holistic scoring of specified characteristics of a product or behavior

Holistic Rubric for Assessing Student Essays

Inadequate: The essay has at least one serious weakness. It may be unfocused, underdeveloped, or rambling. Problems with the use of language seriously interfere with the reader's ability to understand what is being communicated.

Developing Competence: The essay may be somewhat unfocused, underdeveloped, or rambling, but it does have some coherence. Problems with the use of language occasionally interfere with the reader's ability to understand what is being communicated.

Acceptable: The essay is generally focused and contains some development of ideas, but the discussion may be simplistic or repetitive. The language lacks syntactic complexity and may contain occasional grammatical errors, but the reader is able to understand what is being communicated.

Sophisticated: The essay is focused and clearly organized, and it shows depth of development. The language is precise and shows syntactic variety, and ideas are clearly communicated to the reader.

Analytic Rubric for Peer Assessment of Team Project Members

Project Contributions
Below Expectation: Made few substantive contributions to the team's final product.
Good: Contributed a "fair share" of substance to the team's final product.
Exceptional: Contributed considerable substance to the team's final product.

Leadership
Below Expectation: Rarely or never exercised leadership.
Good: Accepted a "fair share" of leadership responsibilities.
Exceptional: Routinely provided excellent leadership.

Collaboration
Below Expectation: Undermined group discussions or often failed to participate.
Good: Respected others' opinions and contributed to the group's discussion.
Exceptional: Respected others' opinions and made major contributions to the group's discussion.


Online Rubrics
For links to online rubrics, go to http://www.calstate.edu/acadaff/sloa/. Many rubrics have been created for use in K-12 education, and they can be adapted for higher education. It's often easier to adapt a rubric that has already been created than to start from scratch.

Rubrics have many strengths:
• Complex products or behaviors can be examined efficiently.
• Developing a rubric helps to precisely define faculty expectations.
• Rubrics are criterion-referenced, rather than norm-referenced. Raters ask, "Did the student meet the criteria for level 5 of the scoring rubric?" rather than "How well did this student do compared to other students?"
• Ratings can be done by students to assess their own work, or they can be done by others, e.g., peers, fieldwork supervisors, or faculty.
• Rubrics can be useful for grading, as well as assessment.

Analytic Rubric for Grading Oral Presentations

Organization
Below Expectation (0-2): No apparent organization. Evidence is not used to support assertions.
Satisfactory (3-5): The presentation has a focus and provides some evidence which supports conclusions.
Exemplary (6-8): The presentation is carefully organized and provides convincing evidence to support conclusions.

Content
Below Expectation (0-2): The content is inaccurate or overly general. Listeners are unlikely to learn anything or may be misled.
Satisfactory (5-7): The content is generally accurate, but incomplete. Listeners may learn some isolated facts, but they are unlikely to gain new insights about the topic.
Exemplary (10-13): The content is accurate and complete. Listeners are likely to gain new insights about the topic.

Style
Below Expectation (0-2): The speaker appears anxious and uncomfortable, and reads notes, rather than speaks. Listeners are largely ignored.
Satisfactory (3-6): The speaker is generally relaxed and comfortable, but too often relies on notes. Listeners are sometimes ignored or misunderstood.
Exemplary (7-9): The speaker is relaxed and comfortable, speaks without undue reliance on notes, and interacts effectively with listeners.

Each trait receives a score within its range, and the trait scores are summed for a Total Score.
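For readers who track analytic scores electronically, here is a minimal illustrative sketch (not part of the original workshop materials) of how a rubric like the one above might be recorded as plain data and used to total a score. The trait names, level labels, and single point values are simplified assumptions based on the table above, not a prescribed implementation.

```python
# Illustrative only: an analytic rubric recorded as plain data, loosely based on
# the oral-presentation rubric above. Point values are assumed (top of each range).
ORAL_PRESENTATION_RUBRIC = {
    "Organization": {"Below Expectation": 2, "Satisfactory": 5, "Exemplary": 8},
    "Content": {"Below Expectation": 2, "Satisfactory": 7, "Exemplary": 13},
    "Style": {"Below Expectation": 2, "Satisfactory": 6, "Exemplary": 9},
}

def total_score(ratings):
    """Sum the point value of the level assigned to each trait."""
    return sum(ORAL_PRESENTATION_RUBRIC[trait][level] for trait, level in ratings.items())

# One rater's judgments for a single presentation.
sample = {"Organization": "Satisfactory", "Content": "Exemplary", "Style": "Satisfactory"}
print(total_score(sample))  # 5 + 13 + 6 = 24
```

Recording the rubric in one shared form like this helps ensure every rater applies the same categories and point values, and it makes later tallying straightforward.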


Suggestions for Using Rubrics in Courses

1. Hand out the grading rubric with the assignment so students will know your expectations and how they'll be graded. This should help students master your learning objectives by guiding their work in appropriate directions.

2. Use a rubric for grading student work and return the rubric with the grading on it. Faculty save time writing extensive comments; they just circle or highlight relevant segments of the rubric. Some faculty include room for additional comments on the rubric page, either within each section or at the end.

3. Develop a rubric with your students for an assignment or group project. Students can then monitor themselves and their peers using agreed-upon criteria that they helped develop. Many faculty find that students will create higher standards for themselves than faculty would impose on them.

4. Have students apply your rubric to some sample products before they create their own. Faculty report that students are quite accurate when doing this, and this process should help them evaluate their own products as they are being developed. The ability to evaluate, edit, and improve draft documents is an important skill.

5. Have students exchange paper drafts and give peer feedback using the rubric, then give students a few days before the final drafts are turned in to you. You might also require that they turn in the draft and scored rubric with their final paper.

6. Have students self-assess their products using the grading rubric and hand in the self-assessment with the product; then faculty and students can compare self- and faculty-generated evaluations.

Sometimes a generic rubric can be used, and it can be refined as raters become more experienced or as problems emerge.

Generic Rubric for Assessing Portfolios
Each learning objective (Learning Objective 1, Learning Objective 2, Learning Objective 3) is rated on the following scale:
Unacceptable: Evidence that the student has mastered this objective is not provided, unconvincing, or very incomplete.
Marginal: Evidence that the student has mastered this objective is provided, but it is weak or incomplete.
Acceptable: Evidence shows that the student has generally attained this objective.
Exceptional: Evidence demonstrates that the student has mastered this objective at a high level.


Steps for Creating a Rubric
1. Identify what you are assessing, e.g., critical thinking.
2. Identify the characteristics of what you are assessing, e.g., appropriate use of evidence, recognition of logical fallacies.
3. Describe the best work you could expect using these characteristics. This describes the top category.
4. Describe the worst acceptable product using these characteristics. This describes the lowest acceptable category.
5. Describe an unacceptable product. This describes the lowest category.
6. Develop descriptions of intermediate-level products and assign them to intermediate categories. You might decide to develop a scale with five levels (e.g., unacceptable, marginal, acceptable, competent, outstanding), three levels (e.g., novice, competent, exemplary), or any other set that is meaningful.
7. Ask colleagues who were not involved in the rubric's development to apply it to some products or behaviors and revise as needed to eliminate ambiguities.

Group Readings
Rubrics can be applied by one person, but group readings can be very effective because they bring faculty together to analyze and discuss student learning. If data are aggregated as results come in, the group reading can end with a discussion of what the results mean, who needs to know the results, what responses might be reasonable (e.g., curricula, pedagogy, or support changes), and how the assessment process itself could be improved.

Who should be invited to group readings?
Faculty and others (e.g., graduate students, fieldwork supervisors, community professionals), especially those who control and offer the curriculum and who can make valid, informed judgments about student learning.

Managing Group Readings
1. If the reliability of the rubric is known to be high, it may be reasonable to have only one reader analyze each document, but it generally is preferable to use two readers so that inter-rater reliability can be examined and discrepancies can be identified and resolved.

2. When two readers work independently, the second reader may be tempted to peek at the first rater's judgments. Readers often are curious about others' opinions, and no harm is done if the first rater's scores are hidden until after the second opinions have been recorded.

3. Sometimes, results are monitored as they are turned in, and documents are given to a third reader when necessary to resolve discrepancies. For example, the facilitator may send any document that has a scorer difference of more than one point to a third reader who determines which rating is more accurate.

4. Sometimes readers work in pairs, independently rating each document, then jointly resolving all disagreements. They may be asked to discuss only the ratings that differ by some amount, such as at least two units.

5. When two raters disagree, faculty must decide which rating will be used in the analysis, or they may decide to use both. Whatever the decision, the project report should document how data were generated.
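To make the third-reader rule in item 3 concrete, the following is a minimal sketch using invented data (the document names and ratings are hypothetical, not from the workshop materials): it flags any document whose two independent ratings differ by more than one point.

```python
# Illustrative sketch of the third-reader rule in item 3: flag any document whose
# two independent ratings differ by more than one point. All data are invented.
paired_ratings = {
    "portfolio_01": (3, 4),
    "portfolio_02": (2, 4),  # differs by 2 points -> route to a third reader
    "portfolio_03": (5, 5),
}

needs_third_reader = [doc for doc, (r1, r2) in paired_ratings.items() if abs(r1 - r2) > 1]
print(needs_third_reader)  # ['portfolio_02']
```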


Scoring Rubric Group Orientation and Calibration

1. Describe the purpose for the review, stressing how it fits into program assessment plans. Explain that the purpose is to assess the program, not individual students or faculty, and describe ethical guidelines, including respect for confidentiality and privacy.

2. Describe the nature of the products that will be reviewed, briefly summarizing how they were obtained.

3. Describe the scoring rubric and its categories. Explain how it was developed.

4. Explain that readers should rate each dimension of an analytic rubric separately, and they should apply the criteria without concern for how often each category is used.

5. Give each reviewer a copy of several student products that are exemplars of different levels of performance. Include, if possible, a weak product, an intermediate-level product, a strong product, and a product that appears to be particularly difficult to judge. Ask each volunteer to independently apply the rubric to each of these products, and show them how to record their ratings.

6. Once everyone is done, collect everyone's ratings and display them so everyone can see the degree of agreement. This is often done on a blackboard, with each person in turn announcing his/her ratings as they are entered on the board. Alternatively, the facilitator could ask raters to raise their hands when their rating category is announced, making the extent of agreement very clear to everyone and making it very easy to identify raters who routinely give unusually high or low ratings.

7. Guide the group in a discussion of their ratings. There will be differences, and this discussion is important to establish standards. Attempt to reach consensus on the most appropriate rating for each of the products being examined by inviting people who gave different ratings to explain their judgments. Usually consensus is possible, but sometimes a split decision is developed, e.g., the group may agree that a product is a “3-4" split because it has elements of both categories. Expect more discussion time if you include a hard-to-score example, but be aware that its inclusion will save everyone grief later because such documents are bound to occur. You might allow the group to revise the rubric to clarify its use, but avoid allowing the group to drift away from the learning objective being assessed.

8. Once the group is comfortable with the recording form and the rubric, distribute the products and begin the data collection.

9. If you accumulate data as they come in and can easily present a summary to the group at the end of the reading, you might end the meeting with a discussion of four questions:
a. What do the results mean?
b. Who needs to know the results?
c. What are the implications of the results for curriculum, pedagogy, or student support services?
d. How might the assessment process, itself, be improved?

10. It can be useful to set up a spreadsheet to calculate means, frequencies, and reliability. Then discuss what the scores mean for your curriculum, teaching methods, students, etc. Report the inter-rater reliability: % within + or – 1 point, % within + or – 2 points, etc.
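As one way to picture the spreadsheet described in step 10, the short sketch below (illustrative only, with invented ratings from two hypothetical readers) computes per-reader means, category frequencies, and the percentage of paired ratings that agree within one point.

```python
# Illustrative sketch of the step-10 summary: means, category frequencies, and
# inter-rater agreement within +/- 1 point. The two readers' ratings are invented.
from collections import Counter
from statistics import mean

reader1 = [4, 3, 5, 2, 4, 3]
reader2 = [4, 4, 4, 2, 5, 1]

print("Reader means:", round(mean(reader1), 2), round(mean(reader2), 2))
print("Category frequencies:", Counter(reader1 + reader2))

within_one = sum(abs(a - b) <= 1 for a, b in zip(reader1, reader2)) / len(reader1)
print(f"Agreement within 1 point: {within_one:.0%}")  # 83% for these invented data
```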


Appendix 4 – Examples of Scoring Rubrics

“Map Rubric” is a scoring tool for the Online Map Creation web site (www.aquarius.geomar.de/omc).

“Grading Standards: Written Work for ‘The Living Environment’ BIOL 111” is a rubric for writing in Biology (A-F scales with definitions) at Southern Illinois University, Edwardsville (http://www.siue.edu/~deder/grstand.html)

“Student Participation Assessment and Evaluation” is a rubric with 4-point scales: frequently, occasionally, seldom, almost never, used at Southern Illinois University, Edwardsville (http://www.siue.edu/~deder/partrub.html)

“Assessing Modeling Projects in Calculus and Precalculus” by C. E. Emenaker of the University of Cincinnati gives a math project problem with two scoring rubrics (http://www.maa.org/saum/maanotes49/116.html)

“Scientific Report Rubric” and “Collaboration Rubric” developed for the Cabrillo Tidepool Study (http://edweb.sdsu.edu/triton/tidepoolunit/Rubrics/reportrubric.html and http://edweb.sdsu.edu/triton/tidepoolunit/Rubrics/collrubric.html)

“Rubric For Evaluating Web Sites” originally developed by John Pilgrim, Horace Mann Academic Middle School, San Francisco (http://edtech.sandi.net/rubric/)

“Secondary Assessment Tools” is a web site with links to several dozen simple Performance Assessment rubrics (http://www.bcps.org/offices/lis/models/tips/assess_sec.html)

“Rubric for E-Zine Pursuit” is a rubric for evaluating an electronic, web-based magazine or journal (http://www.esc20.net/etprojects/formats/webquests/summer99/northside/ezine/rubric.html)

“Course Embedded Assessment” by Larry Kelley, Executive Director of Institutional Effectiveness & Planning at University of Louisiana Monroe, [email protected] . This workshop presentation material describes the process of embedding program assessment in courses, gives outlines of several program assessment plans, describes the use of rubrics, and gives examples of rubrics for Written Communication Skills, Oral Communication Skills, Problem Solving Skills, and Basic Information Technology Skills.

“Student Learning Outcomes in the California State University” is a web site that gives links to about 50 scoring rubrics (http://www.calstate.edu/AcadAff/SLOA/links/rubrics.shtml). Examples include the Scoring Guide for the CSU English Placement Test (EPT) and CSU Fresno rubrics on Critical Thinking, Integrative Science, and Writing.


Appendix 5A – Activity #4: Building a Rubric
Select or write an SLO, identify Primary Traits, and then decide on “observables” for each assessment level.

SLO Statement:

Trait Excellent Satisfactory Unsatisfactory Score

Total:


Appendix 5B – Scoring Rubric – English Department – Modesto Junior College

Development

Excellent—Markedly Exceptional: A comprehensive grasp of the subject matter is demonstrated. Body is developed with original, insightful, and creative support; the paper goes beyond repeating what others have said and contributes something new to our understanding of the topic. Focus is clear, imaginative and fully realized. Demonstrates specific attention to relationship between audience and purpose.

Superior—Clearly Above Average: A thorough grasp of the subject matter is demonstrated. Focus is clear and thoughtful. Body is generally supported by facts, examples, etc., though support will not be as varied or vivid as in an excellent paper. Demonstrates understanding of audience and purpose, though may occasionally stray from it.

Satisfactory—Fully Competent: A basic grasp of the subject matter is demonstrated. Focus is generally adequate but may not be immediately clear to all readers. Response to the assignment is generally adequate. Body supported by facts, examples, details, but are mainly surface oriented and generalized. Demonstrates only some understanding of audience and purpose.

Poor—Marginally Acceptable: A lack of familiarity with the subject matter is demonstrated. Focus is vague, either too general, too narrow, superficial, or indirect. Body supported by few examples or facts; many examples are unanalyzed. Demonstrates poor understanding of audience and purpose.

Failing—Unacceptable: A basic lack of understanding of the subject matter is demonstrated. Focus is not evident. Body largely unsupported by relevant facts or examples. Demonstrates no understanding of audience/purpose.

Organization

Excellent—Markedly Exceptional: Clear, logical, and inventive organization of ideas in relation to one another and to the essay’s focus. Highly effective introduction and conclusion. Appropriate and smooth transitions between paragraphs.

Superior—Clearly Above Average: Clear and logical organization of ideas in relation to one another and to the focus. Appropriate introduction and conclusion. Appropriate and smooth transitions between paragraphs and between most sentences.

Satisfactory—Fully Competent: Ideas generally related to one another and to the focus, but may have some unrelated material. Adequate introduction and conclusion. Some attempt at transitions.

Poor—Marginally Acceptable: Unclear ordering of ideas; organization not readily apparent. Underdeveloped or inappropriate introduction and conclusion. Transitions are lacking.

Failing—Unacceptable: Minimal organization; inappropriate or no paragraphing. Ineffective or missing introduction and conclusion. Minimal or no use of transitions.

Style/Voice

Excellent—Markedly Exceptional: Engaging and individualized voice appropriate to the audience/purpose. Consistency of tone/voice. Refreshing and revealing word choice. Varied and interesting sentence structure.

Superior—Clearly Above Average: Voice appropriate to the audience/purpose, though it may be somewhat generic or predictable in places. Consistency of tone/voice. Interesting and varied word choice. Some creative sentence variety.

Satisfactory—Fully Competent: Voice adequate to audience/purpose, but often is predictable. May be slight inconsistencies of tone. Predictable word choice; low range of synonyms employed. Sentences mechanically sound but lack in variety.

Poor—Marginally Acceptable: Voice generally hard to characterize because of frequent mechanical problems. Phrasing problems, garbled sentence structure noticeable in several places. Overall lack of control/confidence of writing voice.

Failing—Unacceptable: Voice/style not possible due to severe mechanical problems.

Mechanics

Excellent—Markedly Exceptional: Full variety of sentence structures used correctly. Accurate and precise diction and phrasing. Very few grammatical and punctuation errors.

Superior—Clearly Above Average: Variety of sentence structure used correctly. Accurate diction and phrasing. Infrequent grammatical and mechanical errors that rarely disrupt flow or clarity.

Satisfactory—Fully Competent: Frequent sentence structure problems. Diction/phrasing often inaccurate. Frequent and varied grammatical, punctuation, and mechanical errors that interfere with clarity.

Poor—Marginally Acceptable: Sentences often simplistic or incoherent. Frequent misuse of common words and phrases. Many major grammatical, punctuation, and mechanical errors that interfere with some reader’s understanding of the text.

Failing—Unacceptable: Simplistic or incoherent sentences outweigh intelligible sentences. Diction often inaccurate or severely limited vocabulary. Mechanical errors predominate.


Activity #5C: Using a Grading Rubric and Norming the Results
Use the rubric in Appendix 5B to evaluate this sample student writing assignment.

Compare your evaluation with that of a colleague.

English 101: Freshman Composition 

Essay #2: Taking a Stand: High School Exit Exams
Due date for peer edit: 10/13 (must be at least 3 pages and typed)
Final draft due 10/15 (Please turn in with rough draft and editing sheet)

Your Task: Using the articles and editorials provided to you as resources, write a persuasive argument either for or against the exams.

Paper must:
-Have a clear, underlined thesis explaining which side you support
-Explain/summarize the issue at hand
-Give at least two reasons for your stance
-Explain at least one aspect of the other side’s argument
-Use at least two quotes from provided sources

Remember:
-Use MLA format
-Cite all quotes, ideas that are not your own, and statistics
-Include Works Cited list at the end of essay

You may also consider:
-Your own personal experience or the experience of someone you know who is in high school/works at a high school


English 101   10-13-03 

Essay #2: Taking a Stand: High School Exit Exam

High school is stressful considering issue such as: peer pressure, the struggle of passing classes,

and trying to maintain a high Grade Point Average. Most students are desperately trying to keep

themselves a float in this society of raging waters. They feel they cannot handle anything else. For many

of them can hardly carry what they have already. Now students have one more burden to carry and that is

the high school exit exam.

Learning contains many key principles, but the most basic of all is desire. The students have to

have a passion to learn. Many argue this exam hurts the “underprivileged” such as minorities and low-

income families, but is this true (Burke, “Exit Exam” B5). The greatest hindrance that keeps most

students from learning is problems of drugs, alcohol, and domestic violence. These problems are found

both in the homes of the rich, the poor and almost of any ethnic background. There isn’t any good reason

why a student, that doesn’t have any disabilities or language barriers, should have problems learning.

Students that have such problems concerning language barriers and disabilities should be provided

programs that will steadily prepare them for the exit exam. High School students should be made to take

the exit exam to make sure they are progressing, and that they have basic skills to survive life, to get good

jobs or to pursue careers. There shouldn’t be a student left behind, not knowing their basic skills of

reading, writing, and math.

During high school, the greatest amount of progression should be made above any other time in

grade school, and this can only be done with the help of our school system. What is done between grades

9th to 12th does matter, for whatever they learn between these grades they will probably carry with them

for the rest of their lives. The teachers should help the students progress by fully explaining what goals

they want the students to meet. They should push the student to think: always getting them involved in


every class project and discussion. There needs to be an interaction between teachers and students. The

class should never look bored and stagnant. There is a great need for open communication between

teachers and students. Students should be able to come to the teacher if they have any trouble with the

assignment or any other issues pertaining to any of their educational needs.

If they are planning on giving an exam; that test high school students abilities; the schools should

fully prepare the teachers and the students. Teachers should be made to teach all the materials that will be

on the exams year around for the full four years of high school. Students should be tested every year, so

they can see where they need to progress for the upcoming year. This will be helpful to both the teacher

and the student. CNN Student News, center director Jack Jennings said, “You have to provide a system to

help kids succeed…These test are a good idea if they’re done right (Jennings qtd. In “States stick”). “We

cannot just drop an exam on student’s laps and expect that they take it if we don’t fully prepare them. No

part of the exam should be a mystery to them; it should all be review. Students, on the other hand, should

be made accountable for what they learn. They should study often. This exam is supposed to test what

they have learned during these past four years of their lives. If we go about this the right way, this exam

should be like any other test for the student.

This exam should be taken so that the student will have the basic skills to survive life. Everyday,

if we realize it or not, we are surrounded by writing, reading, and mathematics. For example, anytime we

go to the store we use math, whether it is for calculating 30% off of item on sale or giving and receiving

money from the cahier. Another example is the ability to read or write, and its important usage for the

voter in an election. Its importance is beyond our reasoning, for we really have to know what we are

reading, when it has to do with drafting in different laws. Everyday we are surrounded by these

obscurities that call for basic skills, skills that may look non useful, but one-day students will need.

Once students graduate from high school, that’s when life really begins. They will most likely use

all they learned in high school, in college and even after that in the work place. All students will need

these basic skills of reading, writing, and math in their jobs and also in whatever career they decide to


pursue. The whole point of the exam is to encourage students to progress, so they won’t feel lost and

confused, when they graduate and try to find a job or seek a profession.

The high school exit exam shouldn’t even be a debate, if its just basic material that high school

students should already know. “David Cooper, director of secondary education for Modesto City

Schools, said students may take the test up to eight times, and most will eventually pass (qtd. In

Herendeen, “Students Cheer” A1).” Students shouldn’t “eventually” understand the material; they should

know the material (qtd. In Herendeen, “Students Cheer: A1).” The reason why taking the high school

exit exam is an issue is because they don’t already know the basic material, which will be sooner or later

in life, be put before them. We need to go back to the basics, and make sure that math, reading, and

writing are being taught before any other materials. These basics need to be priority, and any other extra

curricular subject, secondary. The only way we can make sure students are being taught, is to test their

abilities. We need to strive together as a people and make sure students are learning. We want students to

leave high school knowing they have progressed, that they have learned something of great value. They

should feel confident when they get out of high school. They should have the ability and opportunity to

survive in life, get a good job, and pursue the career of their dreams. It is our responsibility to make sure

they have their feet planted on solid ground, ready to go out in this world and make a difference.

 

Works Cited

Burke, Frank. Letter. The Modesto Bee 28 June 2003: B5

Herendeen Susan. Letter. The Modesto Bee 10 July 2003: A1

“States stick with high-school exit exam.” CNN Student News 20 Aug. 2003. 12. Oct.

2003

<http://www.cnn.com/2003/EDUCATION/08/13/high.school.exams.ap

 


Appendix 6 – The Case for Authentic Assessment by Grant Wiggins (http://www.ericfacility.net/databases/ERIC_Digests/ed328611.html )

WHAT IS AUTHENTIC ASSESSMENT?
Assessment is authentic when we directly examine student performance on worthy intellectual tasks.

Traditional assessment, by contrast, relies on indirect or proxy 'items'--efficient, simplistic substitutes from which we think valid inferences can be made about the student's performance at those valued challenges.

Do we want to evaluate student problem-posing and problem-solving in mathematics? experimental research in science? speaking, listening, and facilitating a discussion? doing document-based historical inquiry? thoroughly revising a piece of imaginative writing until it "works" for the reader? Then let our assessment be built out of such exemplary intellectual challenges. Further comparisons with traditional standardized tests will help to clarify what "authenticity" means when considering assessment design and use:

Authentic assessments require students to be effective performers with acquired knowledge. Traditional tests tend to reveal only whether the student can recognize, recall or "plug in" what was learned out of context. This may be as problematic as inferring driving or teaching ability from written tests alone. (Note, therefore, that the debate is not "either-or": there may well be virtue in an array of local and state assessment instruments as befits the purpose of the measurement.)

Authentic assessments present the student with the full array of tasks that mirror the priorities and challenges found in the best instructional activities: conducting research; writing, revising and discussing papers; providing an engaging oral analysis of a recent political event; collaborating with others on a debate, etc. Conventional tests are usually limited to paper-and-pencil, one- answer questions.

Authentic assessments attend to whether the student can craft polished, thorough and justifiable answers, performances or products. Conventional tests typically only ask the student to select or write correct responses--irrespective of reasons. (There is rarely an adequate opportunity to plan, revise and substantiate responses on typical tests, even when there are open-ended questions). As a result,

Authentic assessment achieves validity and reliability by emphasizing and standardizing the appropriate criteria for scoring such (varied) products; traditional testing standardizes objective "items" and, hence, the (one) right answer for each.

"Test validity" should depend in part upon whether the test simulates real-world "tests" of ability. Validity on most multiple-choice tests is determined merely by matching items to the curriculum content (or through sophisticated correlations with other test results).

Authentic tasks involve "ill-structured" challenges and roles that help students rehearse for the complex ambiguities of the "game" of adult and professional life. Traditional tests are more like drills, assessing static and too-often arbitrarily discrete or simplistic elements of those activities. Beyond these technical considerations the move to reform assessment is based upon the premise that

assessment should primarily support the needs of learners. Thus, secretive tests composed of proxy items and scores that have no obvious meaning or usefulness undermine teachers' ability to improve instruction and students' ability to improve their performance. We rehearse for and teach to authentic tests--think of music and military training--without compromising validity.

The best tests always teach students and teachers alike the kind of work that most matters; they are enabling and forward-looking, not just reflective of prior teaching. In many colleges and all professional settings the essential challenges are known in advance--the upcoming report, recital, Board presentation, legal case, book to write, etc. Traditional tests, by requiring complete secrecy for their validity, make it difficult for teachers and students to rehearse and gain the confidence that comes from knowing their performance obligations. (A known challenge also makes it possible to hold all students to higher standards.)

WHY DO WE NEED TO INVEST IN THESE LABOR-INTENSIVE FORMS OF ASSESSMENT?

While multiple-choice tests can be valid indicators or predictors of academic performance, too often our tests mislead students and teachers about the kinds of work that should be mastered. Norms are not standards; items are not real problems; right answers are not rationales.

What most defenders of traditional tests fail to see is that it is the form, not the content of the test that is harmful to learning; demonstrations of the technical validity of standardized tests should not be the issue in the assessment reform debate. Students come to believe that learning is cramming; teachers come to believe that tests are after-the-fact, imposed nuisances composed of contrived questions--irrelevant to their intent and success. Both parties are led to believe that right answers matter more than habits of mind and the justification of one's approach and results.


A move toward more authentic tasks and outcomes thus improves teaching and learning: students have greater clarity about their obligations (and are asked to master more engaging tasks), and teachers can come to believe that assessment results are both meaningful and useful for improving instruction.

If our aim is merely to monitor performance then conventional testing is probably adequate. If our aim is to improve performance across the board then the tests must be composed of exemplary tasks, criteria and standards.

WON'T AUTHENTIC ASSESSMENT BE TOO EXPENSIVE AND TIME-CONSUMING?

The costs are deceptive: while the scoring of judgment-based tasks seems expensive when compared to multiple-choice tests (about $2 per student vs. 1 cent) the gains to teacher professional development, local assessing, and student learning are many. As states like California and New York have found (with their writing and hands-on science tests) significant improvements occur locally in the teaching and assessing of writing and science when teachers become involved and invested in the scoring process.

If costs prove prohibitive, sampling may well be the appropriate response--the strategy employed in California, Vermont and Connecticut in their new performance and portfolio assessment projects. Whether through a sampling of many writing genres, where each student gets one prompt only; or through sampling a small number of all student papers and school-wide portfolios; or through assessing only a small sample of students, valuable information is gained at a minimum cost. And what have we gained by failing to adequately assess all the capacities and outcomes we profess to value simply because it is time-consuming, expensive, or labor-intensive? Most other countries routinely ask students to respond orally and in writing on their major tests--the same countries that outperform us on international comparisons. Money, time and training are routinely set aside to insure that assessment is of high quality. They also correctly assume that high standards depend on the quality of day-to-day local assessment--further offsetting the apparent high cost of training teachers to score student work in regional or national assessments.

WILL THE PUBLIC HAVE ANY FAITH IN THE OBJECTIVITY AND RELIABILITY OF JUDGMENT-BASED SCORES?

We forget that numerous state and national testing programs with a high degree of credibility and integrity have for many years operated using human judges:

the New York Regents exams, parts of which have included essay questions since their inception--and which are scored locally (while audited by the state);

the Advanced Placement program which uses open-ended questions and tasks, including not only essays on most tests but the performance-based tests in the Art Portfolio and Foreign Language exams;

state-wide writing assessments in two dozen states where model papers, training of readers, papers read "blind" and procedures to prevent bias and drift gain adequate reliability;

the National Assessment of Educational Progress (NAEP), the Congressionally-mandated assessment, uses numerous open-ended test questions and writing prompts (and successfully piloted a hands-on test of science performance);

newly-mandated performance-based and portfolio-based state-wide testing in Arizona, California, Connecticut, Kentucky, Maryland, and New York. Though the scoring of standardized tests is not subject to significant error, the procedure by which items are

chosen, and the manner in which norms or cut-scores are established is often quite subjective--and typically immune from public scrutiny and oversight.

Genuine accountability does not avoid human judgment. We monitor and improve judgment through training sessions, model performances used as exemplars, audit and oversight policies as well as through such basic procedures as having disinterested judges review student work "blind" to the name or experience of the student--as occurs routinely throughout the professional, athletic and artistic worlds in the judging of performance.

Authentic assessment also has the advantage of providing parents and community members with directly observable products and understandable evidence concerning their students' performance; the quality of student work is more discernible to laypersons than when we must rely on translations of talk about stanines and renorming.

Ultimately, as the researcher Lauren Resnick has put it, "What you assess is what you get; if you don't test it you won't get it." To improve student performance we must recognize that essential intellectual abilities are falling through the cracks of conventional testing.


Appendix 7 -- State and National Standards, Academic & Vocational Competencies

Board of Registered Nursing400 R St., Suite 4030

Sacramento, CA 94244Phone: 916.322.3350Fax: 916.327.4402

Email: [email protected] UR: www.rn.ca.gov

TITLE   16.   Professional   And   Vocational   Regulations §1443.5. Standards of Competent Performance(http://www.calnurse.org/cna/np/brn/standard.html) A registered nurse shall be considered to be competent when he/she consistently demonstrates the ability to transfer scientific knowledge from social, biological and physical sciences in applying the nursing process, as follows:

(1) Formulates a nursing diagnosis through observation of the client's physical condition and behavior, and through interpretation of information obtained from the client and others, including the health team.

(2) Formulates a care plan, in collaboration with the client, which ensures that direct and indirect nursing care services provide for the client's safety, comfort, hygiene, and protection, and for disease prevention and restorative measures. (3) Performs skills essential to the kind of nursing action to be taken, explains the health treatment to the client and family and teaches the client and family how to care for the client's health needs. (4) Delegates tasks to subordinates based on the legal scopes of practice of the subordinates and on the preparation and capability needed in the tasks to be delegated, and effectively supervises nursing care being given by subordinates. (5) Evaluates the effectiveness of the care plan through observation of the client's physical condition and behavior, signs and symptoms of illness, and reactions to treatment and through communication with the client and health team members, and modifies the plan as needed. Appendix 7

American Welding Society
550 NW LeJeune Road, Miami, FL 33126
Phone: (800) 443-9353   Fax: (305) 443-7559
Email: [email protected]   URL: www.aws.org

Welding Codes & Standards (www.aws.org/cgi-bin/shop)
AWS is recognized worldwide for the development of consensus-based American National Standards. Over 170 standards are published as codes, recommended practices, guides, and specifications. Certification is offered in seven different welding processes:

Shielded Metal Arc Welding (SMAW)
Gas Metal Arc Welding (GMAW)
Gas Metal Arc Welding - Short Circuit (GMAW-S)
Flux Cored Arc Welding (FCAW)
Gas Tungsten Arc Welding (GTAW)
Submerged Arc Welding (SAW)
Brazing

National Business Education Association
1914 Association Drive, Reston, VA 20191
Phone: 703-860-8300   Fax: 703-620-4483
Email: [email protected]   URL: www.nbea.org

Business Education Standards (www.nbea.org/curfbes.html)
Using the concepts described in these standards, business teachers introduce students to the basics of personal finance, the decision-making techniques needed to be wise consumers, the economic principles of an increasingly international marketplace, and the processes by which businesses operate. In addition, these standards provide a solid educational foundation for students who want to successfully complete college programs in various business disciplines. Standards are provided in the following areas:

Accounting
Business Law
Career Development
Communication
Computation
Economics & Personal Finance
Entrepreneurship
Information Technology
International Business
Management
Marketing


American Psychological Association
750 First Street, NE, Washington, DC 20002
Phone: 800-374-2721   Fax: 202-336-6123
Email: [email protected]   URL: www.apa.org

Knowledge, Skills, and Values Consistent with the Science and Application of Psychology and with Liberal Arts Education that are Further Developed in Psychology (www.apa.org/ed/pcue/taskforcereport2.pdf)
In this document we provide details for 10 suggested goals and related learning outcomes for the undergraduate psychology major. These Undergraduate Psychology Learning Goals and Outcomes represent what the Task Force considers to be reasonable departmental expectations for the psychology major in United States' institutions of higher education.

Goal 1. Knowledge Base of Psychology: Students will demonstrate familiarity with the major concepts, theoretical perspectives, empirical findings, and historical trends in psychology.
Goal 2. Research Methods in Psychology: Students will understand and apply basic research methods in psychology, including research design, data analysis, and interpretation.
Goal 3. Critical Thinking Skills in Psychology: Students will respect and use critical and creative thinking, skeptical inquiry, and, when possible, the scientific approach to solve problems related to behavior and mental processes.
Goal 4. Application of Psychology: Students will understand and apply psychological principles to personal, social, and organizational issues.
Goal 5. Values in Psychology: Students will be able to weigh evidence, tolerate ambiguity, act ethically, and reflect other values that are the underpinnings of psychology as a discipline.
Goal 6. Information and Technological Literacy: Students will demonstrate information competence and the ability to use computers and other technology for many purposes.
Goal 7. Communication Skills: Students will be able to communicate effectively in a variety of formats.


Goal 8. Sociocultural and International Awareness: Students will recognize, understand, and respect the complexity of sociocultural and international diversity.
Goal 9. Personal Development: Students will develop insight into their own and others' behavior and mental processes and apply effective strategies for self-management and self-improvement.
Goal 10. Career Planning and Development: Students will emerge from the major with realistic ideas about how to implement their psychological knowledge, skills, and values in occupational pursuits in a variety of settings.

Association of College and Research Libraries
1914 Association Drive, Reston, VA 20191
Phone: 703-860-8300   Fax: 703-620-4483
Email: [email protected]   URL: www.acrl.org

Information Literacy Standards (www.acrl.org/infolit)
Information literacy is defined and its relationship to technology, higher education, and pedagogy is discussed. Each of the five standards comes with detailed performance indicators.

Standard One. The information literate student determines the nature and extent of the information needed.

Standard Two. The information literate student accesses needed information effectively and efficiently.

Standard Three. The information literate student evaluates information and its sources critically and incorporates selected information into his or her knowledge base and value system.

Standard Four. The information literate student, individually or as a member of a group, uses information effectively to accomplish a specific purpose.

Standard Five. The information literate student understands many of the economic, legal, and social issues surrounding the use of information and accesses and uses information ethically and legally.


A Hierarchy of Postsecondary Outcomes, from "Defining and Assessing Learning: Exploring Competency-Based Initiatives," a report of the National Postsecondary Education Cooperative Working Group on Competency-Based Initiatives in Postsecondary Education, published by the National Center for Education Statistics, September 2002 (http://nces.ed.gov/pubs2002/2002159.pdf)


Appendix 8 – Assessment Report Example #1xiv

SLO Results – English 371 – Literature & the Visual Arts – Raymond Walters College

Course Objective: Compare and contrast the text and film versions of a literary work.

Desired SLO: After viewing an assigned film based on a literary text, write a review of the film. Include an appraisal of the director's selection and effective translation of content from the literary text and the dominant tone the director seems to be trying to achieve, supporting each statement with detail from the text and film and your personal reaction to the cited scenes.

Grading rubric (traits and point levels):
Plot – 4 points: accurate plot review; 3 points: accurate plot review; 2 points: minor inaccuracies of plot; 1 point: glaring plot inaccuracies.
Text Analysis – 4 points: analysis of text beyond literal interpretation; 3 points: analysis of text beyond literal interpretation; 2 points: analysis of text includes literal interpretation; 1 point: literal analysis.
Supporting Statements – 4 points: support with specific details from text/film; 3 points: weak support with specific details from film; 2 points: few specific details as support; 1 point: no specific details as support.
Personal Reactions – 4 points: personal evaluation based on analysis; 3 points: personal evaluation not based on analysis; 2 points: little personal evaluation; 1 point: no personal evaluation.

Number of Students Scoring at Each Point Level, by Film Reviewed (4 / 3 / 2 / 1 points):
Film #1: 10 / 7 / 3 / 1
Film #2: 11 / 7 / 3 / 0
Film #3: 10 / 8 / 3 / 1
Film #4: 12 / 5 / 5 / 1
Film #5: 13 / 5 / 3 / 0
Film #6: 6 / 8 / 6 / 1
Film #7: 9 / 7 / 7 / 5

Instructor Analysis: I handed out the trait scale to students on the first day of class, but I am not sure they consulted it; upon my inquiring whether they had a copy near the end of the course, few students were able to locate it in their notebooks. This taught me that I should refer to the scale more explicitly in class. I anticipated that it would be easy for students to give an analysis but difficult for them to identify concrete support for their ideas. However, I discovered that students found it easier to point to specific places in the movies that gave them ideas than to articulate those ideas. Therefore, I will revise the scale for the next course to reflect the relative challenges of these skills.

Assessment Report Example #2xiv

PARKLAND COLLEGE ACADEMIC ASSESSMENT (Excerpts)
DEPARTMENT: Fine and Applied Arts    PROGRAM: Mass Communication
Assessment Methods: Direct Assessment Measures (Pre/Post Tests, Capstone exam/project, Primary Trait Analysis, Course Embedded Test, Standardized Exams, Professional Certification, Portfolios, Performance Assessment, Other); Indirect Assessment Measures (Transfer/Employment Data, Grad Surveys/Interviews, Employer/Faculty Surveys)

Intended Outcome 1: Students will demonstrate proficiency in employable Mass Communication skills.
Assessment Criteria & Methods: Students will demonstrate desired mass communication competencies as shown by individual portfolios, when assessed by representatives from industry, as reported on the MC Portfolio Evaluation form.
Actual Results: Written comments from industry representatives indicate that MC students' portfolio assessments ranked 4 (on a scale of 1 to 5, five being the highest score). Suggestions were to include more Web site graphics in the curriculum.
Analysis & Action: Desktop Graphics program revised to include more experience in Web site graphics. Students designed graphics for the current MC home page and links.

Intended Outcome 2: Students will demonstrate learning the basic concepts necessary to perform satisfactorily in Mass Communications entry-level jobs.
Assessment Criteria & Methods: When surveyed using the Parkland College Student Occupational Follow-Up Survey, graduates will describe satisfaction with their Mass Communication knowledge to recall, analyze, evaluate, and utilize basic concepts.
Actual Results: Feedback from employers and students strongly indicated that the Visual Arts program option had become obsolete; preference is given to graduates with Desktop Publishing skills.
Analysis & Action: Visual Arts program option shelved.

Intended Outcome 3: Students in the Mass Communication A.A. program will have the knowledge to successfully complete a Bachelors degree in Mass Communication.
Assessment Criteria & Methods: Four-year institutions will report a 75 percent acceptance rate into Mass Communication programs.
Actual Results: The U of I Coordinator of Transfer Articulation reported that, out of 29 applicants to the Graphics program from other schools, a Mass Com student was the only one admitted.
Analysis & Action: Continue to gather/monitor data. Investigate how many Parkland Graphics students applied.




Assessment Report Example #3

Mesa Community College – Results from Student Learning Outcomes Assessment – Spring 2002 and 2003

Communication
Outcome Statements:

1. Write a clear, well-organized paper using documentation and quantitative tools when appropriate.

2. Construct and deliver a clear, well-organized, verbal presentation.

Results (Written): The mean score for the post-group was significantly higher overall and on the scales for content, organization, and mechanics/style. When each skill is considered separately, students showed relative strength in stating their own position, addressing the prompt, using appropriate voice and style, and sentence structure. Students have consistently rated below the overall average on acknowledging the opposing position, developing each point with appropriate detail and commentary, progressing logically and smoothly, and using transitions and orienting statements.
Results (Oral): Significant differences between beginning students and completing students were shown in the total percentage correct for the assessment overall and for each of the subscales: knowledge about effective interpersonal interchanges, small group interaction, and conducting oral presentations.

Numeracy
Outcome Statements:

1. Identify and extract relevant data from given mathematical situations.

2. Select known models or develop appropriate models that organize the data into tables or spreadsheets, graphical representations, symbolic/ equation format.

3. Obtain correct mathematical results and state those results with the qualifiers.

4. Use the results.

Results: The average percent correct was significantly higher for the post-group overall and for outcomes related to identifying and extracting relevant data, using models to organize data, obtaining results, and stating results with qualifiers. Patterns of performance have remained consistent over several years. Use of models is the strongest area and use of results is the weakest area.

Scientific Inquiry
Outcome Statements:

Demonstrate scientific inquiry skills related to:
1. Hypothesis: Distinguish between possible and improbable or impossible reasons for a problem.
2. Prediction: Distinguish between predictions that are logical or not logical based upon a problem presented.
3. Assumption: Recognize justifiable and necessary assumptions based on information presented.
4. Interpretation: Weigh evidence and decide if generalizations or conclusions based upon given data are warranted.
5. Evaluation: Distinguish between probable and improbable causes, possible and impossible reasons, and effective and ineffective action based on information presented.

Results: There was no significant difference in the average percent correct between groups in the 2002 administration; however, significant differences were noted, overall, in prior years. Students have been most successful in recognizing possible reasons for a problem. Making a conclusion based upon information presented has had the lowest percent correct for the past three years of administration.

Problem Solving / Critical Thinking
Outcome Statements:

1. Identify a problem or argument.
2. Isolate facts related to the problem.
3. Differentiate facts from opinions or emotional responses.
4. Ascertain the author's conclusion.
5. Generate multiple solutions to the problem.
6. Predict consequences.
7. Use evidence or sound reasoning to justify a position.

Results: The average total score was significantly higher for the post-group (completing), overall and for two sub-scales: Interpretation and Evaluation of Arguments. The post-group score was at the 45th percentile when compared to a national sample. Average student scores have been consistently highest for the Interpretation and Evaluation of Arguments sections and lowest for Inference.


Arts & Humanities
Outcome Statements:

1. Demonstrate knowledge of human creations.
2. Demonstrate awareness that different contexts and/or worldviews produce different human creations.
3. Demonstrate an understanding and awareness of the impact that a piece (artifact) has on the relationship and perspective of the audience.
4. Demonstrate an ability to evaluate human creations.

Results: Significant differences were observed overall and in three of four outcome areas: demonstrate an awareness that different contexts and/or world views produce different human creations; an understanding and awareness of the impact that a piece has on the relationship and perspective of the audience; an ability to evaluate human creations.

Information Literacy
Outcome Statements:

1. Given a problem, define specific information needed to solve the problem or answer the question.

2. Locate appropriate and relevant information to match informational needs.

3. Identify and use appropriate print and/or electronic information sources.

4. Evaluate information for currency, relevancy, and reliability.

5. Use information effectively.

Results: The percent correct was significantly higher for the post-group overall and for three of five outcome areas: evaluating currency and relevance of information, identifying sources, and locating information. Students were most successful in evaluating information for currency and relevance, followed by defining information needed to solve a problem and identifying appropriate sources. Locating information was relatively more difficult. Students were least successful in using information effectively.

Cultural Diversity
Outcome Statements:

1. Identify and explain diverse cultural customs, beliefs, traditions, and lifestyles.

2. Identify and explain major cultural, historical and geographical issues that shape our perceptions.

3. Identify and explain social forces that can affect cultural change.

4. Identify biases, assumptions, and prejudices in multicultural interactions.

5. Identify ideologies, practices, and contributions that persons of diverse backgrounds bring to our multicultural world.

Results: Students in the completing (post) group had significantly higher scores on direct measures of knowledge and on several diversity and democracy outcomes in both years. Completing students agreed more often that they have an "obligation to give back to the community." In the most recent administration completing students rated themselves more highly than beginning students on having a pluralistic orientation, being able to see both sides of an issue, and their own knowledge of cultures. Further, they agreed more strongly with statements that support the value of diversity, reflect tolerance for differences related to gender, and indicate that they engage in social action more often.


Parkland College Academic Program Assessment
Program: Computer Information Systems: Microcomputer Support Specialist / Programming Specialization
Assessment Methods: Direct Assessment Measures (Pre/Post Tests, Capstone exam/project, Primary Trait Analysis, Course Embedded Test, Standardized Exams, Professional Certification, Portfolios, Performance Assessment, Other); Indirect Assessment Measures (Focus Groups, Grad Surveys/Interviews, Employer/Faculty Surveys)

Intended Outcome(s): 1. Graduates from this program will have acquired knowledge and skills needed for entry-level positions in a variety of computer-related fields.

Assessment Criteria: 1.a. When surveyed, employers of our interns will rate 80% of the students with an average of 4.5 on a scale of 1-5. The rating will be composed of 14 skill areas, each rated on a scale of 1-5.

Results: 1.a.1. Fall 2000: Two students fell under the 4.5 rating; 80% of the interns received an average score of 4.5 or higher. The weakest area was identified as "Ability to Plan," which received an average score of 4.29.

1.a.2. Spring 2002: Five students took CIS 298: CIS Work Experiences in Spring 2002. Employers for all 5 returned surveys.

Analysis and Action: 1.a.1. Fall 2000 data analyzed in Spring 2001: This indirect measure is not providing the results anticipated. The committee proposes making changes to the survey to make it a more valuable assessment tool. In addition, information will be given to the instructors in CIS 297 (CIS Seminar) and CIS 231 (Systems Analysis, Design and Administration) to enhance course content to encourage students to strengthen their "ability to plan." A direct measure to show "ability to plan" will be included in the capstone tests given near the completion of the program. (See 1.c.)
1.a.2. Spring 2002: Students did well overall in every area. The lowest marks came in the "ability to plan" area, with 1 Excellent and 4 Good ratings. Suggestions have been made for providing additional information in CIS 297: Seminar and CIS 231: Systems Analysis, Design and Administration.

Intended Outcome(s): 1. (continued)

Assessment Criteria: 1.d. 90% of students will score 80% or higher on a standard capstone test to be administered near their completion of the program.

Results: 1.d.1. Fall 1999: The percentage of students answering a given question correctly ranged from 13% on the question answered correctly least often to 87% on the question answered correctly most often.

Analysis and Action: 1.d.1. Fall 1999 data analyzed in Spring 2000: Faculty met and determined that the pilot instrument needed to be changed to gather more accurate results. Students seemed confused by the questionnaire, and we felt the results were not sufficiently valid.

Intended Outcome(s): 1. (continued)

Assessment Criteria: 1.e. All students in the introductory-level required courses for all CIS programs (101 and 117) will be given a set of five questions to be graded with the final exam. Students completing their final courses in CIS will be given 10 questions.

Results: 1.e.1. Fall 2000: Data was collected and reviewed for CIS 101 and CIS 117. 143 students answered questionnaires in 101 with an average score of 84%; 39 students answered questionnaires in 117 with an average score of 90%.
1.e.2. Spring 2001: Data was collected at the end of the semester for CIS 101 and CIS 117. 105 students for CIS 101 had an average score of 86%; 41 students for CIS 117 had an average score of 96%.
1.e.3. Fall 2001: Data was collected from CIS 101 and CIS 117. 118 students for CIS 101 had an average score of 86%; 38 students for CIS 117 had an average score of 98%.

Analysis and Action: 1.e.2. Spring 2001: Overall scores for CIS 101 improved by 2%. The weakest question in CIS 101 was identified: 25% of students missed the question about how to save files using Save vs. Save As. Instructors were encouraged to spend more time on this topic, and the question was reworded to be easier to read for the next semester's assessment test. Overall scores for CIS 117 improved by 3%.
1.e.3. Fall 2001: Overall scores for CIS 101 stayed the same as the previous semester. Responses to the reworded question about saving indicated that some instructors were not thoroughly teaching the concept of Save vs. Save As; 29% of the students answered the question about saving incorrectly. A memo was sent out to all instructors outlining what students need to learn in CIS 101 pertaining to the Save and Save As commands. Scores for CIS 117 improved by 2%.




Appendix 9 – Assessment Plan Examples Internet Sites

North Carolina State University – http://www2.acs.ncsu.edu/UPA/assmt/resource.htm – Comprehensive list of links to national assessment forums, manuals & handbooks, student learning, and other institutions

California State University, Fresno – http://www.csufresno.edu/cetl/assessment/assmnt.html and http://www.csufresno.edu/cetl/assessment/status.html – Links to program assessment plans

Boise State University – http://www2.boisestate.edu/iassess/outcomes/outcomes.htm – Links to program assessment plans, organized by college

Oklahoma State University – http://www.okstate.edu/assess/assessment_plans/assessment_plans.htm – Assessment method examples and assessment plan tips & checklist

California State University, Sacramento – http://www.csus.edu/acaf/assmnt.htm – Listing of program assessment plan links

San Jose State University – http://www.sjsu.edu/ugs/assessment/as-main.html – Program assessment plans organized by college, plus a page of links to other institutions

Southeast Missouri State University – http://www2.semo.edu/provost/aspnhtm/busy.htm – Busy Chairperson's Guide to Assessment (table of contents)

Southern Illinois University – http://www.siue.edu/~deder/assess/depts.html – Program assessment plans

Ohio University – http://www.ohiou.edu/provost/OUTCOMES2000_2001.html – Student Learning Outcomes Assessment, 2000-2001

Central Michigan University – http://www.provost.cmich.edu/outcomes/ – Student learning outcomes by college for each major


Appendix 10 – Activity 8 – Program SLOs from Competency Statements
Sort the following Business program competencies into three categories and then write a global program student learning outcome for each of the three categories.

Program Competency | Category
A. Analyze management theories and their application within the business environment.
B. Analyze special challenges in operations and human resource management in international business.
C. Analyze the characteristics, motivations, and behaviors of consumers.
D. Analyze the elements of the marketing mix, their interrelationships, and how they are used in the marketing process.
E. Analyze the influence of external factors on marketing.
F. Analyze the management functions and their implementation and integration within the business environment.
G. Analyze the role of marketing research in decision making.
H. Apply communication strategies necessary and appropriate for effective and profitable international business relations.
I. Apply marketing concepts to international business situations.
Explain the concepts, role, and importance of international finance and risk management.
J. Apply operations management principles and procedures to the design of an operations plan.
K. Describe the elements, design, and purposes of a marketing plan.
L. Describe the environmental factors that define what is considered ethical business behavior in a global business environment.
M. Describe the interrelatedness of the social, cultural, political, legal, and economic factors that shape and impact the international business environment.
N. Describe the role of organized labor and its influence on government and business.
O. Develop personal management skills to function effectively and efficiently in a business environment.
P. Examine the issues of managing in the global environment.
Q. Explain the role of international business; analyze how it impacts business at all levels (including the local, state, national, and international levels).
R. Identify forms of business ownership and entrepreneurial opportunities available in international business.
S. Recognize the customer-oriented nature of marketing and analyze the impact of marketing activities on the individual, business, and society.
T. Relate balance of trade concepts to the import/export process.
From the National Standards for Business Education © 2001 National Business Education Association, 1914 Association Dr., Reston, VA 20191.

Category #1 Title: Program SLO #1:

Category #2 Title: Program SLO #2:

Category #3 Title: Program SLO #3:


Appendix 11 – Example of Program Assessment Report
West Virginia State Community and Technical College
http://fozzy.wvsc.edu/ctc/program_assesment/General%20Education%20Audit%20Grid.doc

Program Assessment Report
Program: _______________________________   Term & Year: ______________

The report form has five columns:
Methods of Assessment – strategies/techniques/instruments for collecting the feedback data that provide evidence of the extent to which objectives are reached.
Type – E = Enter, I = Intermediate, X = Exit, F = Follow-up.
Program Student Learning Outcomes (A1-A8) – check the number of the SLO assessed by the particular assessment method.
Findings/Evaluation/Conclusions – results of analysis and interpretation of the measurement data.
Recommendations for Improvement – recommended actions for improving the program.

Methods of assessment listed on the form (with type, where given):
Graduate Exit Survey (X)
Employer Satisfaction Survey (F)
Exit interviews of graduates (X)
5-Year Graduate Survey (F)
Alumni survey (F)
Advisory Committee feedback (I)
Peer evaluation of teaching
Student evaluations of teachers (I)



Student internship program (I)
Faculty participation programs with industry (summer appointments)
Analysis of enrollment/graduation data
Internal reviews
External reviews (ABET Accreditation)
Peer review
Dropout and Non-Complete Rate
Community Assessment Needs
Licensure/Certification Practice Test
Faces of the Future Surveys
Student GPA


Appendix 12 – General Education Student Learning Outcomes
West Virginia State Community and Technical College

http://fozzy.wvsc.edu/ctc/program_assesment/GeneralEducationCoreLearningOutcomes.htm

GENERAL EDUCATION CORE LEARNING OUTCOMES
Graduates will be able to:
Communicate articulately in speech and writing.
Think critically about issues, theory, and application.
Use effective human relationship skills to work in a diverse society.
Function effectively and positively in a team environment.
Use library print and electronic resources for literature research.
Use computational skills to solve problems, manipulate and interpret numerical data, and communicate data in a logical manner.
Employ fundamental principles of science, the scientific method of inquiry, and skills for applying scientific knowledge to practical situations.
Use computer technology to organize, access, and communicate information.

GENERAL EDUCATION STUDENT LEARNING OUTCOMES
COCONINO COMMUNITY COLLEGE

COMMUNICATION SKILLS
Present ideas developed from diverse sources and points of view with consideration of target audience.
Demonstrate communication process through idea generation, organization, drafting, revision, editing, and presentation.
Participate in and contribute to collaborative groups.
Construct logical, coherent, well-supported arguments.
Employ syntax, usage, grammar, punctuation, terminology, and spelling appropriate to academic discipline and the professional world.
Demonstrate listening / interpretive skills in order to participate in communications and human exchange.

THINKING SKILLS
Use appropriate method of inquiry to identify, formulate, and analyze a current or historical problem/question (may include recognizing significant components, collecting and synthesizing information, evaluating and selecting solution(s), applying and defending solution(s)).
Translate quantifiable problems into mathematical terms and solve these problems using mathematical or statistical operations.
Interpret graphical representations (such as charts, photos, artifacts) in order to draw appropriate conclusions.
Recognize strengths and weaknesses in arguments.
Demonstrate observational and experimental skills to use the scientific method to test hypotheses and formulate logical deductions.
Understand the uses of theories and models as applied in the area of study.
Develop creative thinking skills for application in problem solving.
Demonstrate a working knowledge of a technological application in an area of study.

DIVERSITY AND GLOBAL PERSPECTIVE
Recognize the diversity of humanity at the local, regional and global levels.
Synthesize information about needs, concerns and contributions of different cultures within society.
Identify the influence of cultural and ethnic backgrounds on individual and group attitudes and values.
Link cultural perspectives, practices, and interactions with the societal and physical environment from which they arose.
Explain the importance of cross-cultural influences on physical, cultural and spiritual heritage.
Relate and explain the connections between past and present events and/or issues.

AESTHETIC PERSPECTIVE
Analyze and evaluate literary, visual, or performing arts using discipline-specific approaches and criteria.
Reflect on personal responses to aesthetic experiences.
Incorporate aesthetic reflection into discipline-specific activities.

ETHICAL AND CIVIL VALUES
Identify and assess community needs and the responsibility to balance individual and societal needs.
Display responsibility and integrity in one's choices and actions.
Integrate knowledge in order to establish an ethical position on an issue and defend it with logical arguments.
Develop an appreciation of education and lifelong learning.
Understand social values and analyze their implications for the individual, community, society, and world.
Recognize the individual's responsibility to continue the exploration of the changing world and one's role in it.


Assessment of General Education Learning Outcomes
An "Institutional Portfolio" Approach to Assessment of General Education Learning Outcomes

What Comprises an "Institutional Portfolio"
A collection of student work ("artifacts") produced throughout the curriculum for each of six major outcomes: Mathematics, Writing, Speaking, Culture and Ethics, Modes of Inquiry, Problem Solving
Reviewed by faculty teams using holistic scoring criteria (rubrics)
Results are compiled, analyzed, and reported in the aggregate by the Office of Institutional Research
Results are reported to the Faculty Assessment Committee which, in turn, reports to the Educational Affairs Committee
Faculty acts on assessment results

Characteristics of the "Institutional Portfolio" Model
The outcomes and scoring teams are multidisciplinary; thus "responsibility" rests with the institution/faculty as a whole, rather than single departments
It is invisible to students, obviating the motivation and other significant problems with standardized tests
It is minimally intrusive for faculty
It requires no special "sessions," no sacrifice of class time (e.g. for testing), no external incentives for students to perform well
It is labor intensive and requires significant institutional resources (faculty release time and/or overload pay, technical support)
It is a dynamic process
It's "messy"

Assessment Plan Logistics
Who Scores: Four-to-six person interdisciplinary faculty teams
How Scored: Individually by team members or as a group
How Many Artifacts: 100 per outcome per year
When Scored: Fall artifacts in spring; spring artifacts in fall
Who Selects Courses: Office of Institutional Research
Who Selects Artifacts: Faculty in each targeted class
Who Collects, Copies, Distributes Artifacts: Office of Institutional Research

For more information contact: Jeff Seybert, Director, Research, Evaluation, and Instructional Development, Johnson County Community College, 12345 College Boulevard, Overland Park, KS 66210-1299, (913) 469-8500 ext. 3442, [email protected]
http://www.jccc.net/home/depts/6111/site/assmnt/cogout


Mathematics Outcome

Outcome Statements: Upon receipt of an associate degree from Johnson County Community College, a student should be able to:

1. Identify relevant data (numerical information in mathematical or other contexts) by
   a. extracting appropriate data from a problem containing extraneous data and/or
   b. identifying appropriate data in a word problem.
2. Select or develop models (organized representations of numerical information, e.g., equation, table, graph) appropriate to the problem which represent the data by
   a. arranging the data into a table or spreadsheet and/or
   b. creating pictorial representations (bar graphs, or pie charts, or rectangular coordinate graphs, etc.) with or without technological assistance and/or
   c. selecting or setting up an equation or formula.
3. Obtain and describe results by
   a. obtaining correct mathematical results, with or without technological assistance and
   b. ascribing correct units and measures to results.
4. Draw inferences from data by
   a. describing a trend indicated in a chart or graph, and making predictions based on that trend and/or
   b. describing the important features of data presented in a table or spreadsheet, and making predictions based on that trend and/or
   c. describing the important features of an equation or formula, and making predictions based on those features and/or
   d. making reasonable estimates when given problems involving quantities in any organized or disorganized form and/or
   e. drawing qualitative conclusions about the original situation based on the quantitative results that were obtained.

The mathematics outcomes consist of four major outcomes, numbered 1 to 4. These major outcomes are each subdivided into several subpoints labeled by letters. A major outcome is demonstrated when at least one subpoint has been demonstrated, except for major outcome 3, where subpoint 3.a. must be demonstrated. A subpoint is demonstrated when at least one instance of the subpoint has occurred, except for subpoints 3.a. (which requires at least 70 percent accuracy of the items examined) and 3.b. (which requires at least 2 instances involving different measures).

Rubrics: The following rubric will measure the mathematics outcomes:
5 = All four major outcomes are demonstrated by the use of more than one subpoint per major outcome.
4 = All four major outcomes are demonstrated.
3 = Three major outcomes are demonstrated.
2 = Two major outcomes are demonstrated.
1 = Only one major outcome is demonstrated.
0 = No major outcomes are demonstrated.

Standards: At least 75 percent of all JCCC students earning associate degrees should obtain a score of 4 or more on the mathematics outcomes rubric. At least 95 percent of all JCCC students earning associate degrees should obtain a score of 3 or more on the mathematics outcomes rubric.
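To make the scoring rules above concrete, here is a minimal Python sketch of how the mathematics rubric and its standards could be checked mechanically. The function names and the data structure for recording scorers' judgments are illustrative assumptions, not part of the JCCC plan; faculty scorers work from paper artifacts.

```python
# Illustrative sketch (not the JCCC instrument) of the mathematics-outcome scoring rules.
# "demonstrated" maps each major outcome (1-4) to the set of subpoint letters a faculty
# scorer judged to be demonstrated in the artifact.

def math_rubric_score(demonstrated):
    met = 0    # major outcomes demonstrated
    multi = 0  # major outcomes demonstrated through more than one subpoint
    for outcome in (1, 2, 3, 4):
        subpoints = demonstrated.get(outcome, set())
        # A major outcome counts when at least one subpoint is demonstrated,
        # except outcome 3, where subpoint 3.a must be demonstrated.
        ok = ("a" in subpoints) if outcome == 3 else bool(subpoints)
        if ok:
            met += 1
            if len(subpoints) > 1:
                multi += 1
    if met == 4 and multi == 4:
        return 5          # all four outcomes shown via more than one subpoint each
    return met            # otherwise the score is the count of outcomes demonstrated

def math_standards_met(scores):
    # Standards: at least 75% of degree earners score 4 or more, and at least 95% score 3 or more.
    n = len(scores)
    return sum(s >= 4 for s in scores) / n >= 0.75 and sum(s >= 3 for s in scores) / n >= 0.95

# Example: math_rubric_score({1: {"a", "b"}, 2: {"c"}, 3: {"a"}, 4: {"e"}}) returns 4.
```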

Writing Outcome

Outcomes Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able to write a clear, well-organized paper using documentation and quantitative tools when appropriate.

Outcome Rubrics:
6 = Essay demonstrates excellent composition skills including a clear and thought-provoking thesis, appropriate and effective organization, lively and convincing supporting materials, effective diction and sentence skills, and perfect or near perfect mechanics including spelling and punctuation. The writing perfectly accomplishes the objectives of the assignment.
5 = Essay contains strong composition skills including a clear and thought-provoking thesis, although development, diction, and sentence style may suffer minor flaws. Shows careful and acceptable use of mechanics. The writing effectively accomplishes the goals of the assignment.
4 = Essay contains above average composition skills, including a clear, insightful thesis, although development may be insufficient in one area and diction and style may not be consistently clear and effective. Shows competence in the use of mechanics. Accomplishes the goals of the assignment with an overall effective approach.
3 = Essay demonstrates competent composition skills including adequate development and organization, although the development of ideas may be trite, assumptions may be unsupported in more than one area, the thesis may not be original, and the diction and syntax may not be clear and effective. Minimally accomplishes the goals of the assignment.
2 = Composition skills may be flawed in either the clarity of the thesis, the development, or organization. Diction, syntax, and mechanics may seriously affect clarity. Minimally accomplishes the majority of the goals of the assignment.
1 = Composition skills may be flawed in two or more areas. Diction, syntax, and mechanics are excessively flawed. Fails to accomplish the goals of the assignment.

Standards: Ten percent of students who have met the requirements for an associate degree at JCCC will earn 6 (excellent) on each of the communication rubrics. Thirty percent of students earning an associate degree will score 5 (very good) or 6 (excellent). Eighty percent will earn scores of 4 (satisfactory) or higher, and the top 98 percent will earn scores of 3 (minimal accomplishment of educational goals) or higher. The remaining 2 percent of the associate degree recipients are expected to earn the score of 2 (unsatisfactory) on the communication rubrics. The score of 1 represents a skill level beneath the expectation of all associate degree recipients at JCCC. Hence, no associate degree recipients are expected to score at the level of 1 on the communications rubrics.

Speaking Outcome

Outcome Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able to make a clear, well-organized verbal presentation.

Rubrics:
Very good/excellent (5-6) = The communicator presents a message that is exceptionally appropriate for the purpose, occasion, and audience with a purpose that is exceptionally clear and identifiable. The message is supported using material that is exceptional in quality and variety. The communicator uses an exceptionally clear and coherent organizational structure, provides a logical progression within and between ideas, and uses language that is exceptionally clear, vivid, and appropriate. The communicator makes exceptional use of vocal variety in a conversational mode; has exceptional articulation, pronunciation, and grammar; and demonstrates physical behaviors that provide exceptional support for the verbal message.
Satisfactory (3-4) = The communicator presents a message that is appropriate for the purpose, occasion, and audience with a purpose that is adequately clear and identifiable. The message is supported using material that is appropriate in quality and variety. The communicator uses a reasonably clear and coherent organizational structure, provides a logical progression within and between ideas, and uses language that is reasonably clear, vivid, and appropriate. The communicator makes acceptable use of vocal variety in a conversational mode; has acceptable articulation, pronunciation, and grammar; and demonstrates physical behaviors that provide adequate support for the verbal message.
Unsatisfactory (1-2) = The communicator presents a message that is not appropriate for either the purpose, occasion, or audience, or is without a clear and identifiable purpose. The message is supported with material that is inappropriate in quality and variety. The communicator fails to use a clear and coherent organizational structure, does not provide a logical progression within and between ideas, and uses unclear or inappropriate language. The communicator fails to use vocal variety; fails to speak in a conversational mode; fails to use acceptable articulation, pronunciation, and grammar; or fails to use physical behaviors that provide adequate support for the verbal message.

Standards: Ten percent of students who have met the requirements for an associate degree at JCCC will earn 6 (excellent) on each of the communication rubrics. Thirty percent of students earning an associate degree will score 5 (very good) or 6 (excellent). Eighty percent will earn scores of 4 (satisfactory) or higher, and the top 98 percent will earn scores of 3 (minimal accomplishment of educational goals) or higher. The remaining 2 percent of the associate degree recipients are expected to earn the score of 2 (unsatisfactory) on the communication rubrics. The score of 1 represents a skill level beneath the expectation of all associate degree recipients at JCCC. Hence, no associate degree recipients are expected to score at the level of 1 on the communications rubrics.


Culture and Ethics Outcome

Outcomes Statements: Upon receipt of an associate degree from Johnson County Community College, a student should be able to:

1. Demonstrate a fundamental knowledge of world geography.
2. Demonstrate knowledge of the major cultural issues of a person's own culture as well as other cultures.
3. Demonstrate knowledge of major historical events affecting one's culture and other cultures.
4. Demonstrate familiarity with contemporary global issues.
5. Demonstrate an understanding of major ethical concerns.

Rubrics:
Demonstrates knowledge of world geography:
4 = Compares and contrasts geographies and their relationship to their respective cultures.
3 = Analyzes the relationship between geography and culture.
2 = Analyzes the relationship between geography and economy.
1 = Identifies major characteristics of political and natural geography.

Demonstrates knowledge of the major cultural issues of a person's own culture as well as other cultures:
4 = Compares and contrasts cultural issues affecting one's culture and other cultures.
3 = Analyzes major cultural issues.
2 = Identifies major cultural issues in other cultures.
1 = Identifies major cultural issues from one's culture.

Demonstrates knowledge of major historical events affecting one's culture and other cultures:
4 = Compares and contrasts historical events affecting one's culture and other cultures.
3 = Analyzes major historical events.
2 = Identifies major historical events in other cultures.
1 = Identifies major historical events in one's culture.

Demonstrates familiarity with contemporary global issues:
4 = Compares and contrasts the effect of global issues on cultures.
3 = Analyzes contemporary global issues.
2 = Identifies several contemporary global issues.
1 = Identifies a contemporary global issue.

Demonstrates an understanding of major ethical concerns:
4 = Develops a comprehensive, rational argument for an ethical position and describes its implications for personal and social behavior.
3 = Analyzes an ethical issue, the pro and con positions and its consequences, and the issue's relation to other ethical issues.
2 = Identifies the ethical dimensions of academic disciplines.
1 = Identifies a general ethical issue.

Standards: The standard of judgment is that 60 percent of the students will score 2 or higher on each outcome.

Modes of Inquiry Outcome

Outcomes Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able to demonstrate understanding of the modes of inquiry by identifying an appropriate method of accessing credible information and data resources; applying the selected method; and organizing results.

Rubrics: Every artifact will be evaluated to determine whether the student has demonstrated the ability to perform each rubric item. The rubric items have been separated for modes of inquiry and for problem solving, and each artifact will be given scores for either or both areas, as appropriate. The following rubric will measure the modes of inquiry outcomes:

1. Identifies an appropriate method of accessing credible information and data resources. 2. Applies the selected method. 3. Organizes results.

If an artifact presents evidence that a student demonstrated the ability to perform a rubric, the artifact will be given a plus (+) score for that rubric.

If an artifact presents evidence that a student did not demonstrate the ability to perform a rubric, the artifact will be given a minus (-) score for that rubric.


If it appears that the assignment did not present an opportunity for students to perform a rubric, the artifact will be given a zero (0) score for that rubric. For example, this may be a result of instances where the instructor's assignment defined the problem or method of gathering information. The subcommittee scorers should concur on those particular rubrics which receive zeros.

Artifacts scored for Modes of Inquiry must allow the student to perform at least 2 of the 3 rubrics. Only rubrics with plus or minus scores will be counted. A zero score is not counted and does not impact the outcome standard. It is not necessary for the subcommittee scorers to concur on rubrics which receive plus or minus scores. The artifacts are scored as follows:

3 = the student demonstrated the ability to perform all rubrics that the student had the opportunity to perform (3 or 2).
2 = the student was given the opportunity to perform all 3 rubrics and demonstrated the ability to perform 2 of them.
1 = the student demonstrated the ability to perform only one rubric.
0 = the student was unable to demonstrate the ability to perform any of the rubrics.

Standards: At least 80% of the Modes of Inquiry artifacts should receive a score of 3.
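The plus/minus/zero rules above amount to a short decision procedure. The sketch below is a hypothetical illustration; the list-of-marks representation is an assumption for the example, not part of the JCCC scoring instrument.

```python
# Hypothetical sketch of the Modes of Inquiry artifact scoring described above.
# "marks" holds one entry per rubric item, in order: "+" (demonstrated),
# "-" (not demonstrated), or "0" (the assignment gave no opportunity).

def modes_of_inquiry_score(marks):
    assert len(marks) == 3
    counted = [m for m in marks if m != "0"]      # zero scores are not counted
    if len(counted) < 2:
        raise ValueError("artifact must allow at least 2 of the 3 rubrics")
    plusses = counted.count("+")
    if plusses == len(counted):                   # all available rubrics (3 or 2) demonstrated
        return 3
    if len(counted) == 3 and plusses == 2:
        return 2
    if plusses == 1:
        return 1
    return 0

# Standard: at least 80% of Modes of Inquiry artifacts should receive a score of 3.
def modes_standard_met(scores):
    return sum(s == 3 for s in scores) / len(scores) >= 0.80
```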

Problem Solving Outcome

Outcomes Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able to demonstrate understanding of solving problems by recognizing the problem; reviewing information about the problem; developing plausible solutions; and evaluating results.

Rubrics: Every artifact will be evaluated to determine whether the student has demonstrated the ability to perform each rubric item. The rubric items have been separated for modes of inquiry and for problem solving, and each artifact will be given scores for either or both areas, as appropriate. The following rubric will measure the problem-solving outcomes:

1. Recognizes the problem. 2. Reviews information about the problem. 3. Develops plausible solutions. 4. Evaluates results.


If an artifact presents evidence that a student demonstrated the ability to perform a rubric, the artifact will be given a plus (+) score for that rubric. If an artifact presents evidence that a student did not demonstrate the ability to perform a rubric, the artifact will be given a minus (-) score for that rubric. If it appears that the assignment did not present an opportunity for students to perform a rubric, the artifact will be given a zero (0) score for that rubric. For example, this may be a result of instances where the instructor's assignment defined the problem or method of gathering information. The subcommittee scorers should concur on those particular rubrics which receive zeros.

Artifacts scored for Problem Solving must allow the student to perform at least 3 of the 4 rubrics. Only rubrics with plus or minus scores will be counted. A zero score is not counted and does not impact the outcome standard. It is not necessary for the subcommittee scorers to concur on rubrics which receive plus or minus scores. The artifacts are scored as follows:
4 = the student demonstrated the ability to perform all 4 rubrics.
3 = the student demonstrated the ability to perform 3 rubrics.
2 = the student was given the opportunity to perform 3 rubrics and demonstrated the ability to perform 2 of them.
1 = the student was given the opportunity to either perform 4 rubrics and demonstrated the ability to perform 1 or 2 of them, or perform 3 rubrics and demonstrated the ability to perform only 1 rubric.
0 = the student was unable to demonstrate the ability to perform any of the rubrics.

Standards: At least 80% of the Problem Solving artifacts should receive a score of 3.
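The Problem Solving scale follows the same pattern with four rubric items. Again, the sketch below is hypothetical and assumes the same mark representation used in the earlier example.

```python
# Hypothetical sketch of the Problem Solving artifact scoring described above,
# using the same "+", "-", "0" marks, one per rubric item (four items here).

def problem_solving_score(marks):
    assert len(marks) == 4
    counted = [m for m in marks if m != "0"]
    if len(counted) < 3:
        raise ValueError("artifact must allow at least 3 of the 4 rubrics")
    plusses = counted.count("+")
    if plusses == 4:
        return 4
    if plusses == 3:
        return 3
    if len(counted) == 3 and plusses == 2:
        return 2
    if plusses >= 1:    # 4 opportunities with 1-2 demonstrated, or 3 opportunities with only 1
        return 1
    return 0
```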


West Virginia State Community and Technical College
GENERAL EDUCATION CORE-AUDIT GRID

B. GENERAL EDUCATION LEARNING OUTCOMES

The audit grid maps the eight general education learning outcomes (B.1-B.8 below) against the general education courses, listed in the required sequence for student progression through the program; each cell in the grid marks how a course treats an outcome using the codes I, E, R, and A defined below.

Course areas (as numbered in the grid): 1., 2. Written and/or Oral Communications, 3. Mathematics, 4. Natural Science, 5., 6. Social Science, 7. Information Skills.
Course columns, in sequence: COLL 101, ENGL 101, ENGL 102, ENGL 112, ENGL 160, ENGL 204, BST 230, COMM 100, MATH 100, MATH 101, MATH 102, MATH 121, BST 104, CHEM 101, CHEM 130, PHYS 103, PHYS 110, PHYS 120, PHYS 170, PHYS 191 & 203, PHYS 201 & 203, BIO 101, BIO 102, BIO 210, HUM 101, SOCL 101, POSC 100, POSC 101, PSYC 151, HIST 207, HIST 208, ECON 201, ECON 202, ET 112, CS 106, BST 240, ITEC 101.

Outcome rows:
B.1 Communicate articulately in speech and writing
B.2 Think critically about issues, theory, and application
B.3 Use effective human relationship skills to work in a diverse society
B.4 Function effectively and positively in a team environment
B.5 Use library print and electronic resources for literature research
B.6 Use computational skills to solve problems, manipulate and interpret numerical data, and communicate data in a logical manner
B.7 Employ fundamental principles of science, the scientific method of inquiry, and skills for applying scientific knowledge to practical situations
B.8 Use computer technology to organize, access, and communicate information

I = Introduces   E = Emphasizes   R = Reinforces   A = Applies
Introduces: Student is not familiar with content or skill. Instruction concentrates on introducing students to the content area or skill.
Emphasizes: Student should have brought basic content or skill to the course. Instruction concentrates on enhancing content/strengthening skill and adding new content material, building more complex skills based on entrance competency.
Reinforces: Student brings reasonable knowledge/content/skill/competency to the situation as a result of content or skill being taught and/or emphasized at some previous point in their educational career. Instructional activity continues to teach and build upon previous competency and reinforces content or skill competency.
Applies: Student has knowledge/content/skill/competency as a result of content or skill being taught and/or emphasized at some previous point in their educational career. Instructional activity applies a previously taught and/or emphasized content or skill.
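As an illustration only, an audit grid of this kind can be kept as a simple data structure and queried, for example to list the courses that introduce a given outcome. The course-to-code assignments in the sketch below are hypothetical placeholders, not the actual grid cells.

```python
# Hypothetical fragment of an audit grid kept as a data structure; the code
# assignments below are placeholders for illustration, not the actual grid cells.
audit_grid = {
    "B.1 Communicate articulately in speech and writing": {
        "ENGL 101": "IE", "ENGL 102": "ER", "COMM 100": "RA",
    },
    "B.6 Use computational skills to solve problems": {
        "MATH 100": "IE", "MATH 121": "ERA", "PHYS 110": "RA",
    },
}

def courses_with_code(outcome, code, grid=audit_grid):
    """Return the courses whose cell for the given outcome contains the given code."""
    return sorted(course for course, cell in grid.get(outcome, {}).items() if code in cell)

# Example: courses_with_code("B.1 Communicate articulately in speech and writing", "I")
# returns ['ENGL 101'].
```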


Appendix 13. Resources and References for Student Learning Outcomes Assessment

Good Practices
“An Assessment Manifesto” by College of DuPage (IL) is an excellent values statement.

“9 Principles of Good Practice for Assessing Student Learning” by the American Association of Higher Education are the foundational principles of assessment.

“Palomar College Statement of Principles on Assessment” is a succinct two-page summary of assessment and how it is done at Palomar College (CA).

“Closing the Loop” -- seven misperceptions of SLOs and responses to each by Tom Angelo.

“Five Myths of Assessment” by David Clement, Monterey Peninsula College, published in Inside English (Spring 2003), the newsletter of the English Council for California Two-Year Colleges (www.ecctyc.org). Expresses concern that SLOs will affect faculty evaluation, intrude on the classroom, diminish academic freedom, and lead to standards that are watered down and blandly uniform.

“The Case for Authentic Assessment” by Grant Wiggins, presented at the California Assessment Institute. The paper addresses several questions: What is authentic assessment? Why do we need to invest in these labor-intensive forms of assessment? Won’t authentic assessment be too expensive and time-consuming? Will the public have any faith in the objectivity and reliability of judgment-based scores?

“Is Accreditation Accountable? The Continuing Conversation Between Accreditation and the Federal Government” by the Council for Higher Education Accreditation (2003) provides a thorough discussion of the tensions between the federal government’s call for accountability for student learning and traditional process based peer review accreditation methods.

Establishing the Student Learning Outcomes Process
“Assessment Plan/Progress Report” by Isothermal Community College (NC) explains the SLO process well.

“Developing an Assessment Plan to Learn about Student Learning” by Peggy Maki of AAHE gives a tabular “Assessment Guide” which covers general steps in setting up a student learning outcome assessment process.

“Methods of Assessment of Student Learning” classifies SLO methods as Direct, Indirect, and Outputs.

“Assessment—An Institution-Wide Process to Improve and Support Student Learning” by College of DuPage (IL) is a handbook which lays out the student learning outcomes process and roles in general terms.

“Defining and Assessing Learning: Exploring Competency-Based Initiatives,” a report of the National Postsecondary Education Cooperative Working Group on Competency-Based Initiatives in Postsecondary Education published by the National Center for Education Statistics, September 2002. Section 4 on Principles of Strong Practice is particularly useful, giving twelve principles clustered in four areas: planning for competency-based education initiatives; selecting assessment methods; creating and ensuring that learning experiences lead to competencies; and reviewing assessment results to identify changes needed to strengthen student learning. The report concludes with eight case studies; of particular note are those of Sinclair Community College (OH), which has a flourishing competency-based initiative that “guarantees” competencies of graduates, and Hagerstown Community College (MD), which uses a “career transcript” listing specific competencies.

“Assessment at the Program Level” by Trudy H. Bers. Notable features: 1) summarizes ten approaches to program assessment, 2) discusses challenges to implementation, 3) describes good practices at six community colleges.



Narratives of Faculty Experiences with Student Learning Outcomes

“Using Rubrics” by Michelle Christopherson of the Modesto (CA) Junior College English Department

“Course-Based Assessment in English at Riverside Community College” by Arend Flick

“Course Level Assessment Currently Being Used: Why Turn Towards Them?” by Lisa Brewster of the San Diego Miramar College Speech Department

“Does Assessment of Student Learning Outcomes Make a Difference? One Department’s Experience” by Jerry Rudmann, Irvine Valley College

Program Assessment

“Displaying Sociological Imagination” at College of DuPage (IL) gives the process and results for assessing sociology (and shows the need for inter-rater reliability; a simple agreement calculation is sketched after this list).

“Guide to Outcomes Assessment of Student Learning” at CSU Fresno is a “how to” guide.

“Undergraduate Program Assessment Plan for Anthropology at CSU Fresno.” Methods: pre/post test, writing rubric, embedded exam questions, student survey.

“Outcomes Assessment Plan” for Math at CSU San Bernardino. Method: embedded exam questions.

“Outcomes Assessment Status Report” for Communications at CSU San Bernardino. Methods: portfolio, intern job performance.

“Outcomes Assessment Status Report” for Nursing at CSU San Bernardino. Methods: clinical supervisor evaluations, exit survey (commercial vendor), embedded full tests (commercial vendor).

Parkland College Academic Program Assessments: 1) Theatre. Methods: performance assessment and grad surveys. 2) Accounting. Methods: college-produced end-of-course exams, performance assessment, and grad surveys.

The Geneva College (PA) “Program Guide” has a good example of a program audit.

North Carolina State’s document “Data for Program Outcomes Assessment” gives Engineering program competencies and assessment.
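The inter-rater reliability point noted above can be made concrete with a very small calculation. The sketch below is an added illustration (the 4-point rubric scale and the scores are invented, not taken from the College of DuPage report): it computes exact and within-one-point agreement between two raters scoring the same set of student papers.

```python
# Percent agreement between two raters scoring the same student work
# on a 4-point rubric scale. Scores below are invented for illustration.

rater_a = [4, 3, 2, 4, 1, 3, 3, 2]
rater_b = [4, 3, 3, 4, 1, 2, 3, 2]

exact = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / len(rater_a)

print(f"Exact agreement:  {exact:.0%}")     # 75% for the sample scores above
print(f"Within one point: {adjacent:.0%}")  # 100% for the sample scores above
```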

General Education Assessment

“General Education Assessment Pilot Project” by Coconino Community College (AZ) describes how the college wrote general education learning outcomes and identified which courses covered them. It also describes how the college gave CAAP exams in reading and writing, with a SWOT analysis.

In the “Assessment Plan/Progress Report” Isothermal Community College (NC) established student learning outcomes in 1) Communications (reading, writing, speaking, listening), 2) Information Literacy, 3) Problem Solving, 4) Interpersonal Skills, 5) Quantitative Skills, and 6) Cognitive Skills. (MJC Institute Exercise: Write observables for these SLOs.) Each of the Isothermal GE skills areas has a rubric with a 1-4 scale (but not observables for each level except for Quantitative Skills).

“Summary of Two Years of CAAP Assessment” at College of DuPage (IL) gives comparisons to national norms on 6 tests (writing, reading, math, critical thinking, science reasoning, essay). Students self-reported on progress in three other general education areas: understanding and appreciating culture, understanding and appreciating environment, developing a system of personal values.

“Benchmarks for Core Skills” at Palomar College (CA) gives six general education competency sets: Communication, Cognition, Information Competency, Social Interaction, Aesthetic Responsiveness, Personal Development and Responsibility. The document has Demonstrated Competencies in three categories: Beginner, Developing, and Accomplished.



General Education Core-Audit Grid from the University of West Virginia Community and Technical College. Each of the college’s eight general education skills is identified on a matrix that lists all courses within the five categories of GE courses, coding the level of mastery as I for Introduces, E for Emphasizes, R for Reinforces, or A for Applies.

“Assessment of General Education Learning Outcomes: An ‘Institutional Portfolio’ Approach to Assessment of General Education Learning Outcomes” Johnson County Community College. This document defines the “institutional portfolio,” gives the logistics of implementation, and then lists six GE outcome statements, each with detailed competencies, rubrics and standards.

Summary of Results from Student Outcomes Assessment - Spring 2002 and 2003, Mesa (AZ) Community College Office of Research and Planning. Mesa CC uses a student test sampling approach to SLO assessment. This document details their GE outcome statements in seven areas and summarizes the testing results.

Writing Measurable Outcomes

The Geneva College (PA) “Program Guide” has good examples of writing measurable outcomes.

The “Assessment Primer” by the FLAG Project stresses deep learning by connecting Curriculum, Instruction and Assessment (CIA). Particularly strong on matching the goal with the assessment tool.

“Learning Outcomes: Learning Achieved by the End of a Course or Program: Knowledge – Skills – Attitudes” By Shirley Lesch, George Brown Toronto City College. The ABC’s of learning outcomes in nine easy-to-read pages.

Tools of Assessment

Overview

“Advantages and Disadvantages of Assessment Techniques” by Barbara Wright (8/15/02, presented at a California Assessment Institute workshop). Covers plusses and minuses of portfolios, capstone courses and projects, performance assessments, embedded assessment, classroom research and assessment, locally developed tests and commercial standard tests.

Rubrics: How-To Guides

“The Use of Scoring Rubrics for Assessment and Teaching” by Mary Allen of CSU’s Institute for Teaching and Learning is a three-page summary of what they are, how to create them, and how to use them. An example is included on assessment of oral presentations. She also has a six-page version entitled “Developing and Applying Rubrics” which has considerably more detail.

“Primary Trait Analysis: Anchoring Assessment in the Classroom” by Ruth Benander, Janice Denton, Deborah Page and Charlotte Skinner, Raymond Walters College (OH), from JGE: Journal of General Education, Vol. 49, No. 4, 2000.
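As a complement to these how-to guides, the following minimal sketch (added here for illustration; the criteria, weights, and sample scores are hypothetical and not drawn from the Allen or Benander materials) shows the arithmetic behind an analytic rubric: each criterion is scored on a 1-4 scale and the weighted criterion scores combine into a single overall score.

```python
# Applying a simple analytic rubric: each criterion is scored 1-4
# (e.g., 1 = Beginning, 4 = Accomplished) and weighted.
# Criteria, weights, and the sample scores are hypothetical.

rubric = {
    "Organization":     0.30,
    "Evidence/support": 0.40,
    "Mechanics":        0.30,
}

def weighted_score(scores, rubric):
    """Return the weighted rubric score on the same 1-4 scale."""
    assert set(scores) == set(rubric), "score every criterion exactly once"
    return sum(scores[criterion] * weight for criterion, weight in rubric.items())

sample = {"Organization": 3, "Evidence/support": 4, "Mechanics": 2}
print(f"Weighted score: {weighted_score(sample, rubric):.2f}")  # prints 3.10
```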

Rubrics: Examples

“Map Rubric” is a scoring tool for the Online Map Creation web site (www.aquarius.geomar.de/omc).

“Grading Standards: Written Work for ‘The Living Environment’ BIOL 111” is a rubric for writing in Biology (A-F scales with definitions) at Southern Illinois University.

“Student Participation Assessment and Evaluation” is a rubric with 4-point scales (frequently, occasionally, seldom, almost never) used at Southern Illinois University.

“Assessing Modeling Projects in Calculus and Precalculus” by C. E. Emenaker of the University of Cincinnati gives a math project problem with two scoring rubrics: analytic and holistic.



“Scientific Report Rubric” and “Collaboration Rubric” developed for the Cabrillo Tidepool Study.

“Rubric For Evaluating Web Sites” originally developed by John Pilgrim, Horace Mann Academic Middle School, San Francisco

“Secondary Assessment Tools” is a web site with links to several dozen simple Performance Assessment rubrics (http://www.bcps.org/offices/lis/models/tips/assess_sec.html).

“Student Learning Outcomes in the California State University” is a web site that gives links to about 50 scoring rubrics (http://www.calstate.edu/AcadAff/SLOA/links/rubrics.shtml). Examples include the Scoring Guide for the CSU English Placement Test (EPT) and CSU Fresno rubrics on Critical Thinking, Integrative Science, and Writing.

Portfolios

“Individual Student Tracking Project” gives a brief explanation of what portfolios are and how to use them. From Palomar College (CA).

Classroom Assessment Techniques

“Classroom Assessment: A Manual for Faculty Developers” by the National Council for Staff, Program and Organizational Development (NCSPOD) is a step-by-step manual for putting on a CATs workshop.

Embedded Assessment

The Journal of Chemical Education produces a “Chemical Concepts Inventory,” a 22-question nationally normed multiple-choice test on basic chemistry concepts.

The “Field-tested Learning Assessment Guide” (FLAG Project) has good examples (problems, tests, surveys) in science, math, engineering and technology (a copy of this list is provided).

“Course Embedded Assessment Process” developed by Larry Kelley at the University of Louisiana Monroe gives a summary of course embedded assessment, provides examples of ten program assessment plans, lays out the basics of rubrics, and gives several rubric templates.

Local California Community College Training and Resource Materials

Modesto Junior College (CA) held a training institute in the summer of 2003 for 36 faculty and staff entitled “Measuring Student Learning Outcomes.” The document includes activities for writing measurable objectives, writing student learning outcomes starting with existing course objectives, how to embed assessment in a course, the basics of rubric writing, and how to construct a program assessment plan. MJC is also holding a summer training institute in 2004 with an activity and resource guide entitled “Student Learning Outcomes—A Focus on Results.”

Bakersfield College (CA) has assisted the majority of its faculty in writing student learning outcomes for their courses. Faculty leaders Janet Fulks and Kate Pluta have put together a resource manual entitled “Assessing Student Learning” that guides faculty through the process of writing SLOs, including definitions, criteria, good and bad examples, and SLOs from their own courses. The document also covers how to include all three learning domains: cognitive, psychomotor, and affective. The document concludes with the story of how the college ramped up the SLO process, a summary of achievements to date, a philosophy statement on SLOs adopted by the Academic Senate, and a listing of web resources.

State and National Standards on Academic & Vocational Competencies

Welding Codes & Standards (www.aws.org/cgi-bin/shop): “AWS is recognized worldwide for the development of consensus-based American National Standards. Over 170 standards -- as codes, recommended practices, guides and specifications. Certification is offered in seven different welding processes.”



Business Education Standards (www.nbea.org/curfbes.html) “Using the concepts described in these standards, business teachers introduce students to the basics of personal finance, the decision-making techniques needed to be wise consumers, the economic principles of an increasingly international marketplace, and the processes by which businesses operate. In addition, these standards provide a solid educational foundation for students who want to successfully complete college programs in various business disciplines.”

California Code of Regulations, Title 16, Professional and Vocational Regulations, §1443.5, Standards of Competent Performance (http://www.calnurse.org/cna/np/brn/standard.html): “A registered nurse shall be considered to be competent when he/she consistently demonstrates the ability to transfer scientific knowledge from social, biological and physical sciences in applying the nursing process in accord with the six enumerated standards.”





“Undergraduate Psychology Learning Goals and Outcomes” This document is the work of the Task Force on Undergraduate Psychology Major Competencies appointed by the American Psychological Association’s Board of Educational Affairs. The report provides details for 10 suggested goals and related learning outcomes for the undergraduate psychology major. These represent what the Task Force considers to be reasonable departmental expectations for the psychology major in United States' institutions of higher education.

“Information Literacy Competency Standards for Higher Education” by the Association of College and Research Libraries (ACRL). Each of the five standards comes with detailed performance indicators.

Assessment Plan Examples—Internet Sites

North Carolina State University: http://www2.acs.ncsu.edu/UPA/assmt/resource.htm. Contains a comprehensive list of links to national assessment forums, manuals & handbooks, student learning, & other institutions.

California State University, Fresno: http://www.csufresno.edu/cetl/assessment/assmnt.html & http://www.csufresno.edu/cetl/assessment/status.html . Contains links to program assessment plans.

Boise State University: http://www2.boisestate.edu/iassess/outcomes/outcomes.htm . Contains links to program assessment plans organized by college.

Oklahoma State University: http://www.okstate.edu/assess/assessment_plans/assessment_plans.htm . Contains assessment method examples and assessment plan tips and checklist.

California State University at Sacramento: http://www.csus.edu/acaf/assmnt.htm . Contains listing of program assessment plan links.

San Jose State University: http://www.sjsu.edu/ugs/assessment/as-main.html . Contains program assessment plans organized by college and a page containing links to other institutions.

Southeast Missouri State University: http://www2.semo.edu/provost/aspnhtm/busy.htm . Busy chairperson's guide to assessment (table of contents).

Southern Illinois University: http://www.siue.edu/~deder/assess/depts.html . Contains program assessment plans.

Ohio University: http://www.ohiou.edu/provost/OUTCOMES2000_2001.html . Student learning outcomes assessment, 2000-2001.

Central Michigan University: http://www.provost.cmich.edu/outcomes/ . Student learning outcomes by college for each major.

Endnotes

i “Is Accreditation Accountable?” by the Council for Higher Education Accreditation (2003) provides a thorough discussion of the tensions between the federal government’s call for accountability for student learning and traditional process-based peer review accreditation methods: http://www.chea.org/pdf/CHEAmonograph_Oct03.pdf

ii Lisa Brewster’s approach is summarized in “Course Level Assessment Currently Being Used: Why Turn Towards Them?” (October 2003), presented at the Student Learning Outcomes workshop sponsored by the RP Group: http://cai.cc.ca.us/SLOworkshops/Strand2/Brewster%20on%20Speech%20SLOs.doc

iii Janet Fulks’ SLOs are on the web at http://www2.bc.cc.ca.us/bio16/Student%20Learning%20Outcomes.htm and her grading rubrics are at http://www2.bc.cc.ca.us/bio16/projects_and_grading.htm

iv For an excellent short article on this topic see “The Case for Authentic Assessment” by Grant Wiggins at http://ericae.net/edo/ED328611.htm

v SLO Good Practice Statements: Palomar College: http://www.palomar.edu/alp/principles.html ; College of DuPage: http://www.lgu.ac.uk/deliberations/assessment/manifest.html ; American Association of Higher Education: http://www.aahe.org/assessment/principl.htm

vi Primary Trait Analysis definition is from “Integrating the Assessment of General Education into the Classroom—A Two-Year College Model” by Ruth Benander and Janice Denton of Raymond Walters College and Barbara Walvoord of the University of Notre Dame, presented at the Annual Meeting of the North Central Accrediting Association in April of 1997: http://www.rwc.uc.edu/phillips/Assessment/NCApaper.html See also Effective Grading: A Tool for Learning and Assessment. Walvoord, Barbara E. and Virginia J. Anderson. San Francisco: Jossey-Bass Publishing, Inc. 1998; and “Primary Trait Analysis: Anchoring Assessment in the Classroom” by Benander, Denton, Page and Skinner; Journal of General Education, Vol. 49, No. 4, 2000.

vii Mary Allen at CSU Fresno has written a succinct two pages of advice on the “Use of Rubrics” that is well worth reading: http://www.calstate.edu/acadaff/sloa/links/using_rubrics.shtml For a more detailed commentary on rubrics, see “Developing and Applying Rubrics” by Ethelynda Harding, also of CSU Fresno, presented at a chemistry conference in March of 2004: http://www.csufresno.edu/cetl/Events/Events%2003-04/ChemConf/Rubrics.pdf

viii “Assessing Modeling Projects In Calculus and Precalculus: Two Approaches” by Charles E. Emenaker, University of Cincinnati, Raymond Walters College: http://www.maa.org/saum/maanotes49/116.html

ix Summary of direct assessment methods taken from “A Glossary of Measurement Terms,” ERIC Digest: http://ericae.net/edo/ed315430.htm ; the Temple University “Teachers Connection”: www.temple.edu/CETP/temple_teach/ ; and the NCIIA Assessment Workshop: www.nciia.org/CD/public/htmldocs/papers/p_and_j.pdf

x “A Private Universe.” Schneps, M. H. and P. M. Sadler (1987). Harvard-Smithsonian Center for Astrophysics, Science Education Department, Science Media Group. A Private Universe. Video. Washington, DC: Annenberg/CPB: Pyramid Film and Video, 18 minutes.

xi For commentary on informal norming sessions on an English writing rubric see “Using Rubrics” by Michelle Christopherson of Modesto Junior College: http://cai.cc.ca.us/SLOworkshops/Strand2/Using Rubrics.doc

xii Undergraduate Psychology Major Learning Goals And Outcomes: A Report, American Psychological Association (March 2002): http://www.apa.org/ed/pcue/taskforcereport2.pdf

xiii Taken from “A Handbook on Assessment for Two Year Colleges” by Ed Morante of College of the Desert: http://cai.cc.ca.us/Fall2002Institute/2002/assessmenthandbookfinal.doc

xiv Examples of Course Level Assessment Plans: Raymond Walters College: http://www.rwc.uc.edu/phillips/Assessment/AcadAssess.html ; California State University, Fresno, Anthropology: http://www.csufresno.edu/cetl/assessment/Programs/Anthropology/AnthroPlan.pdf


Links can be found on the web version: http://cai.cc.ca.us/workshops/SLOFocusOnResults.doc
