Accreditation Seminar
Jennifer Michael, Ed.D., R.T.(R)
February 28, 2020
Atlanta, Georgia
Housekeeping Items
Bathrooms
Breaks
Lunch
CE Credit
- Sign in
- Survey
Ice Breaker
JRCERT Mission Statement
The JRCERT promotes excellence in education and elevates quality and safety of patient care through
the accreditation of educational programs in radiography, radiation therapy, magnetic resonance, and
medical dosimetry.
Board of Directors
Bette Schans, Ph.D., R.T.(R), FASRT - Chair
Lorie Zelna, M.S., R.T.(R)(MR) - 1st Vice Chair
Julie Lasley, Ph.D., R.T.(R)(T) - 2nd Vice Chair
Chad N. Hensley, III, M.Ed., R.T.(R)(MR) - Secretary/Treasurer
Beverly J. Felder, M.P.A.
Tracy L. Herrmann, M.Ed., R.T.(R)
Jason W. Stephenson, M.D.
Lisa Schmidt, Ph.D., R.T.(R)(M), CRT
Mahsa Dehghanpour, Ed.D., CMD
JRCERT Professional Staff
Leslie F. Winter, M.S., R.T.(R)
Chief Executive Officer
Traci Lang, M.S.R.S., R.T.(R)(T)
Executive Associate Director
Tricia Leggett, D.H.Ed., R.T.(R)(QM), FASRT
Director of Instructional Design and Technology
Brian Leonard, M.B.A., R.T.(R)
Accreditation Specialist
Jennifer Michael, Ed.D., R.T.(R)
Accreditation Specialist
Jason Mielcarek, M.A.M.Ed.
Accreditation Assistant
Tim Ebrom, M.S.
Accreditation Assistant
JRCERT Support Staff
Teresa Cruz
Finance Manager
Paul Lund, B.A.
Information Technology Administrator
Angie Mielcarek
Senior Executive Assistant
Janet Murzyn
Accreditation Services Coordinator
Janet Luczak, B.S.
Administrative Assistant
Joanne Sauter, B.M., B.A.
Administrative Assistant
Meagan Cruz
Office Assistant
JRCERT Program Statistics (January 2020)
Radiography - 604
Radiation Therapy - 71
Magnetic Resonance - 14
Medical Dosimetry - 19
2019 Accreditation Awards
8 Year - 70
5 Year - 32
3 Year - 8
Probation - 6
Website Resources
Resources Update
Interim Report Module
Interim Report Checklist
JRCERT Accreditation (Student Focused)
Outcomes Assessment
Understanding of Program Effectiveness Data
Calculating Program Effectiveness Module
Portal Instructional Videos (under FAQs)
Activity Update
• Full Standards Revision
• Updates available on the JRCERT Web Site
• Final Draft Approval by Board in April Board Meeting
• Fully implemented January 2021
• Site Visit Checklist (Program & Faculty)
• Flat Fee of $900 per site visitor
• Program responsible for site visitor hotel expense
• Invoices can now be paid online
• “The Link” ~ JRCERT’s Online Learning Management System
LINK: Learning, Innovation, Networking, Knowledge
SELF-STUDY REPORT
Initial Accreditation
Continuing Accreditation
Initial Accreditation
1st: Programs must provide:
Institutional accreditation
State authorization to offer post-secondary level education
Qualified Program Director and Clinical Coordinator (if applicable)
Appropriate Clinical Settings
Initial Accreditation
2nd: Portal Access
Pay initial application fee
Access to Portal
Documentation for all clinical settings to be recognized
Documentation for program officials to be recognized
Completion of Self-Study Report (6-month submission timeframe)
Continuing Accreditation Timeline
One year from the projected site visit date, the program will receive a “Greetings letter”
Self-study submission due in 6 months
Site visit within 6 months of Self-study review
Site Visit Team report submitted to the JRCERT following site visit
Continuing Accreditation Timeline
JRCERT Report of Findings within 3 months
Program response to the JRCERT within 6-8 weeks
Board of Directors Meeting
Accreditation award letter
Progress Report or Interim Report –if applicable
Expectations
Demonstration of compliance with standards & objectives
Self-evaluation of program
Identification of strengths and weaknesses
Plan for addressing identified issues
STANDARDS
There are 6 standards. Each standard is titled and includes a narrative statement supported by specific objectives. Each objective, in turn, includes the following clarifying elements:
• Explanation – provides clarification on the intent and key details of the objective.
• Required Program Response – requires the program to provide a brief narrative and/or documentation that demonstrates compliance with the objective.
• Possible Site Visitor Evaluation Methods – identifies additional materials that may be examined and personnel who may be interviewed by the site visitors at the time of the on-site evaluation to help determine if the program has met the particular objective. Review of additional materials and/or interviews with listed personnel is at the discretion of the site visit team.
Required Program Response
• Assurance
• Objective 1.6: Submit section of Student Handbook to confirm program has a grievance policy.
• Narrative
• Objective 1.5: Describe how the program assures security and confidentiality of student records, etc.
• Assurance and Narrative
• Objective 4.2: Submit section of Student Handbook that contains the pregnancy policy and describe how the policy is made known to students.
Providing Narratives & Documentation
• Narratives in the portal are capped at 3,000 characters per response.
• Documentation uploaded to the portal is capped at 15 documents per response.
• Professional staff reserve the right to return the self-study report to the program to comply with these requirements.
Standard Summary
Strengths
Concerns
Plan for Addressing Concern(s)
Progress
Constraints
AMS View of Self-Study
Self-Study Preparation Process Considerations
Involve communities of interest
Develop plan for self-study process
Involve someone unfamiliar with your program for clarity
Be concise but complete
Use samples for exhibits – recommended organization of the report
Things to Consider:
DO NOT ASSUME THE JRCERT ALREADY HAS MATERIAL OR DOCUMENTS.
DO NOT BE SO CONCISE THAT THE SVT HAS TO GUESS.
INVOLVE FACULTY IN THE PROCESS.
SITE VISIT:
Scheduling
Purpose
Team Assignment
Pre-Site Visit Communication
On-Site Evaluation
Site Visit: Scheduling
• Dates are determined after the Self-Study is reviewed
• Site Visit Scheduling Form
• Program notified by JRCERT Accreditation Services Coordinator
Site Visit: Purpose
Validate:
• Application material
• Self-Study Report
Evaluate:
• Program’s personnel, facilities, and resources in support of its mission and goals
Assess:
• Relationship between program efforts and requirements of objectives
SV Team Assignment
Minimum of 2
Conflict of interest
Geographic considerations
Sponsorship considerations
Apprentice participation
Communications During Site Visit
• Team chair contacts program director to establish agenda
• Communications shift from Professional Staff to Team Chair
• Following visit, communication shifts back to the JRCERT office
Site Visit
• Two (2) days
• Tour sponsoring institution (classrooms, learning resources, etc.)
• Visit selected clinical sites
• Interviews with administration, faculty, clinical instructors, and students
Pre-exit Interview Meeting with Program Director
REPORT OF FINDINGS
(ROF)
Report of Findings
The Official Report is based on:
Self-Study Report
Report of Site Visit Team Findings
Staff review of relevant materials
Official Report
ROF with citation
ROF Citation
Based on the documentation submitted by the program and the findings of the site visit team, the program appears to be in substantial compliance, at the time of the site visit, with Objectives 4.1, 4.2, 4.3, 4.7, and 4.8. The program is not in compliance with Objectives 4.4, 4.5, and 4.6.
• The program is not in compliance with the following:
• Objective 4.4 – Assures that medical imaging procedures are performed under the direct supervision of a qualified radiographer until a student achieves competency.
• Objective 4.5 – Assures that medical imaging procedures are performed under the indirect supervision of a qualified radiographer after a student achieves competency.
• Objective 4.6 – Assures that students are directly supervised by a qualified radiographer when repeating unsatisfactory images.
Program Response to ROF
Narrative
• Describe the procedures for making students, CIs, and staff aware of the supervision policies.
Assurance
• Provide updated policies and assurance that students, CIs, and staff have been made aware of the update.
Program Response to ROF
Be concise, but complete.
Provide narrative and documentation.
Evidence of implementation is important.
Response is submitted through the Portal.
E-mail sent to the CEO or President for electronic signature.
Questions about the Portal? Refer to Portal Helps/YouTube videos and FAQs.
**Direct questions to the JRCERT Professional Staff member who developed the ROF.
AMS - Program Response to ROF
Package for Board Consideration
Previous ROF
Current ROF
Current Award Letter
Program’s response to current ROF
Staff recommendation
Accreditation Award Levels
❖ Based on review of program package
❖ Determined by Board of Directors
• Initial:
• Withhold
• 18 months (minimum)
• 3 years (maximum)
• Continuing:
• 8 years (maximum)
• 5 years with/without progress report
• 3 years with/without progress report
• Probation
Compliance Timeframe

Program Length       Compliance Timeframe
2 years or longer    24 months
1 year               18 months
Failure to demonstrate compliance, or to identify mitigating circumstances, within the specified time period will result in Involuntary Withdrawal of Accreditation.
PROGRESS REPORTS
Progress Report
Program Officials Should:
Make the connection between the initial recommendation and the narrative in the Report of Findings
Understand that the first response was inadequate in some way
Contact professional staff for clarification
Be clear
Provide documentation; evidence of implementation important
INTERIM REPORTS
Interim Report
Required of programs with maximum accreditation award
➢ Includes:
• basic program information
• elements of Standards One, Two, Four, Five, and Six
➢ Board of Directors’ accreditation action:
• 8-year award maintained, or
• award reduced and review process expedited
Resources
• Interim Report Modules
• http://www.jrcert.org/programs-faculty/learning-modules/
• Interim Report Checklist
• http://www.jrcert.org/interim-report-checklist/
Interim Report Objectives
• Objective 1.10
• Objective 2.9
• Objective 4.1
• Objective 4.2
• Objective 4.4
• Objective 4.5
• Objective 4.6
• Objective 5.1
• Objective 5.4
• Objective 5.5
• Objective 6.1
• Objective 6.2
• Objective 6.5
Compliance for Supervision
• Describe how:
• students, clinical instructors, and clinical staff are made aware of the supervision requirements.
• the program’s supervision requirements are monitored and enforced in the clinical education setting.
• Provide:
• representative samples of instruments (e.g., clinical evaluations, student surveys) that document the monitoring and enforcement of supervision policies.
• copies of memos to students, clinical instructors, and clinical staff; and/or meeting minutes that document discussion of the supervision requirements.
Extra Considerations
Provide representative samples – completed or blank copies are acceptable. Document… Document… Document.
STANDARDS
Radiography
• Standard One - Integrity
The program demonstrates integrity in the following: representations to communities of interest and the public, pursuit of fair and equitable academic practices, and treatment of, and respect for, students, faculty, and staff.
• Objective 1.2: Provides equitable learning opportunities for all students
Mammography Position Statement:
Radiography, Radiation Therapy, and Medical Dosimetry
• Standard Four - Health and Safety
The program’s policies and procedures promote the health, safety, and optimal use of radiation for students, patients, and the general public.
• Objective 4.1: Assures the radiation safety of students through the implementation of published policies and procedures that are in compliance with Nuclear Regulatory Commission regulations and state laws as applicable.
Radiography, Radiation Therapy, and Medical Dosimetry
• Interpretation: All students who participate in using equipment in an energized laboratory or clinical environment must be monitored for radiation exposure, including but not limited to simulation procedures or quality assurance.
• Adopted by the Joint Review Committee on Education in Radiologic Technology: 04/15 (effective 04/15)
Magnetic Resonance
• Standard Four - Health and Safety
The program’s policies and procedures promote the health and safety for students, patients, and the general public.
• Objective 4.1: Makes available to students and the general public accurate information about potential workplace hazards associated with magnetic fields.
Radiography, Radiation Therapy, and Medical Dosimetry
• Standard Four - Health and Safety
The program’s policies and procedures promote the health, safety, and optimal use of radiation for students, patients, and the general public.
• Objective 4.3: Assures that students employ proper radiation safety practices.
Radiography, Radiation Therapy, and Medical Dosimetry
• Interpretation: Programs must establish a safety screening protocol for students having potential access to the magnetic resonance environment. This assures that students are appropriately screened for magnetic wave or radiofrequency hazards. Programs must describe how they prepare students for magnetic resonance safe practices and provide a copy of the screening protocol, if applicable.
• Adopted by the Joint Review Committee on Education in Radiologic Technology: 10/14 (effective 10/14)
Take advantage of resources:
• Assessment Corner
• Your institution
• Google
What is Assessment?
Assessment is the systematic collection, review, and use of information to improve student learning and educational quality. - JRCERT, Standard Five – Objective 5.1, based on Palomba and Banta’s definition (Assessment Essentials, 1999)
Assessment Essentials, 2nd Ed. (2015):
The process of providing credible evidence of resources, implementation actions, and outcomes undertaken for the purpose of improving the effectiveness of instruction, programs, and services in higher education.
What is Student Learning Outcomes Assessment?
• The ongoing process of:
1. Establishing clear, measurable, expected SLOs
2. Systematically gathering, analyzing, and interpreting evidence to determine how well students’ learning matches expectations
3. Using the resulting information to understand and improve student learning
4. Reporting on processes and results
Assessment Involves:
MAKING YOUR EXPECTATIONS EXPLICIT AND PUBLIC
USING THE RESULTING INFORMATION TO DOCUMENT, EXPLAIN, AND IMPROVE PERFORMANCE
Goal of Assessment?
• Information-based decision making
• “The end of assessment is action”
• Do not attempt to achieve the perfect research design… gather enough data to provide a reasonable basis for action.
Walvoord (2010)
Pitfalls of Assessment
• Compliance with external demands
• Gathering data no one will use, or data collected only because it is required (# of comps, dosimeter readings)
• Making the process too complicated
Course Grades
• A course grade cannot pinpoint concepts that students have or have not mastered
• Grading Criteria
• Attendance, Participation, Bonus points
• Poor inter-rater reliability or vague grading standards
• Not holistic
• Do grades have a place in an Assessment program?
Curriculum Map

           Student Learning Outcomes
Courses    SLO 1    SLO 2    SLO 3    SLO 4
RAD 150
RAD 153    I        I        I
RAD 154    R        I        I
RAD 232    R        R        R        R
RAD 234    R        R
RAD 250    M        M        M & A    M
RAD 255    M & A    M & A    A        M & A

“I” = Introduce
“R” = Reinforce, practice
“M” = Mastery
“A” = Assessed for program assessment
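One way to sanity-check a map like this is to hold it as plain data and scan for gaps. Below is a minimal sketch in Python; the course numbers and marker placement mirror the illustrative table above, and none of this is a JRCERT requirement. It flags any SLO that is never introduced (“I”), brought to mastery (“M”), or assessed (“A”) anywhere in the curriculum.

```python
# Curriculum map as plain data: course -> {SLO: set of markers}.
# "I" = Introduce, "R" = Reinforce, "M" = Mastery, "A" = Assessed.
curriculum_map = {
    "RAD 153": {"SLO 1": {"I"}, "SLO 2": {"I"}, "SLO 3": {"I"}},
    "RAD 154": {"SLO 1": {"R"}, "SLO 2": {"I"}, "SLO 3": {"I"}},
    "RAD 232": {"SLO 1": {"R"}, "SLO 2": {"R"}, "SLO 3": {"R"}, "SLO 4": {"R"}},
    "RAD 234": {"SLO 1": {"R"}, "SLO 2": {"R"}},
    "RAD 250": {"SLO 1": {"M"}, "SLO 2": {"M"}, "SLO 3": {"M", "A"}, "SLO 4": {"M"}},
    "RAD 255": {"SLO 1": {"M", "A"}, "SLO 2": {"M", "A"}, "SLO 3": {"A"}, "SLO 4": {"M", "A"}},
}

for slo in ["SLO 1", "SLO 2", "SLO 3", "SLO 4"]:
    seen = set()
    for markers in curriculum_map.values():
        seen |= markers.get(slo, set())
    missing = {"I", "M", "A"} - seen  # every SLO should eventually carry all three
    if missing:
        print(f"{slo}: never {sorted(missing)} anywhere in the curriculum")
    else:
        print(f"{slo}: introduced, mastered, and assessed")
```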
Types of Assessment
Student Learning
What students will do or achieve
• Knowledge
• Skills
• Attitudes
Program Effectiveness
What the program will do or achieve
• Certification Pass Rate
• Job Placement Rate
• Program Completion Rate
• Graduate Satisfaction
• Employer Satisfaction
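Each program effectiveness measure above is a simple proportion. The sketch below makes the arithmetic explicit using made-up single-cohort counts; the JRCERT’s published definitions (for example, multi-year averaging and the timing windows for exam attempts and job placement) govern how the official rates are actually calculated.

```python
def rate(numerator: int, denominator: int) -> float:
    """Return a percentage, guarding against an empty denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

# Illustrative single-cohort counts (not real program data).
started = 24      # students who began the cohort
completed = 20    # students who graduated
attempted = 19    # graduates who attempted the credentialing exam
passed = 18       # graduates who passed
seeking = 17      # graduates actively seeking employment
employed = 16     # graduates employed in the discipline

print(f"Program completion rate: {rate(completed, started):.1f}%")
print(f"Certification pass rate: {rate(passed, attempted):.1f}%")
print(f"Job placement rate:      {rate(employed, seeking):.1f}%")
```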
Types of Assessment
Formative Assessment
•Gathering of information during the progression of a program.
•Allows for student improvement prior to program completion.
Summative Assessment
•Gathering of information at the conclusion of a program.
Mission Statement
Goals
SLOs
MISSION STATEMENT
• Mission Statement - The program's mission statement should clearly define its purpose and scope and is periodically reevaluated.
• Is the program's mission statement consistent with the focus of the institution's mission?
• Is it easily understood?
• Does it reflect what is expected from graduates?
Goals
• broad statements of student achievement that are consistent with the mission of the program
• should address all learners and reflect clinical competence, critical thinking, communication skills, and professionalism
Goals Should NOT:
CONTAIN ASSESSMENT TOOLS
CONTAIN INCREASES IN ACHIEVEMENT
CONTAIN PROGRAM ACHIEVEMENTS
Goals?
The program will prepare graduates to function as entry-level ___.
The faculty will assure that the JRCERT accreditation requirements are followed.
Students will accurately evaluate images for diagnostic quality.
85% of students will practice age-appropriate patient care on the mock patient care practicum.
Student Learning Outcomes
• Specific
• Measurable
• Attainable
• Realistic
• Targeted
Student Learning Outcomes
Students will __________ _____________.
(action verb) (something)
The JRCERT suggests no more than 8-9 total SLOs.
KNOWLEDGE: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline
COMPREHENSION: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate
APPLICATION: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write
ANALYSIS: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Inspect, Inventory, Question, Separate, Summarize, Test
SYNTHESIS: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write
EVALUATION: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate
Lower-division course outcomes draw primarily from the lower levels of the taxonomy (Knowledge, Comprehension, Application); upper-division course and program outcomes draw from the higher levels (Analysis, Synthesis, Evaluation).
Assessment Measurements
The most important criterion when selecting an assessment method is whether it will provide useful information - information that indicates whether students are learning and developing in ways faculty have agreed are important.
(Palomba & Banta, 1999)
Assessment Plan Review
Measurement Tools - Assessment best practices suggest the use, where appropriate, of two or more measurement tools for each SLO.
Measurement from multiple perspectives can often provide a more accurate picture of student learning.
• Do tools validate one another so that the data is accurate and reliable?
• Are enough measurement tools utilized to assure a valid picture of student achievement?
• Are there too many tools for each SLO, wasting time on collection of data?
~ Are the best tools available being used to measure the SLO?
~ Have they provided results that we believe accurately measure the SLO?
~ Should different tools be considered?
~ Should the existing tool be modified to improve the accuracy and validity of the results provided?
• Is the sample size from each measurement tool large enough to yield valid results? For example, if an employer survey is used to assess critical thinking skills and only two surveys (the “n” number) were returned from a graduating class of fifteen, the data would not be sufficient to provide reliable assessment information from this tool (see the sketch below).
• Identify the “n” number, i.e., the sample size, when reporting the results.
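The “n” number check lends itself to a quick calculation. A minimal sketch, reusing the 2-of-15 employer-survey figures from the example above; the 50% response-rate cutoff is an illustrative assumption, not a JRCERT rule.

```python
def sufficient_sample(n_returned: int, cohort_size: int, min_rate: float = 0.5) -> bool:
    """Flag whether a tool's sample size can support reliable conclusions."""
    return cohort_size > 0 and (n_returned / cohort_size) >= min_rate

# Employer-survey example from above: 2 surveys returned, class of 15.
n, cohort = 2, 15
print(f"n = {n} of {cohort} ({100 * n / cohort:.0f}% response rate)")
print("Sufficient for reliable assessment:", sufficient_sample(n, cohort))  # -> False
```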
Assessment Plan Review
• Benchmarks - Programs must set the expectations for how well the students are learning. If only a section/part of a measurement tool is used, then the program must be able to set a benchmark for that particular subsection of the entire measurement tool.
• Is the benchmark consistent with the measurement tool?
~ If the scale is not a 100% scale, has the scale being used been clearly identified?
• Benchmarks are reasonable expectations. If, for example, a “passing” benchmark is set at a 75% average for the entire cohort, some students could be performing well below the acceptable “passing” level of 75% (see the sketch below). The program may wish to consider setting the performance benchmark higher than the minimum “passing” level.
• Are benchmarks set at reasonable and acceptable levels?
~ Should benchmarks be set higher to reflect the true expectations for student learning?
~ If students are consistently meeting a benchmark, should the benchmark be increased?
• If a benchmark is raised, what must be done to improve the program in order to get the students to that higher level?
• Should a benchmark be lowered if it is not met for several cycles of assessment? Programs should examine results over several cycles for trends, analyze the reasons for any unmet benchmark(s), and make modifications to improve student performance. Programs should “stretch” to reach the highest levels of student achievement possible before deciding to lower a benchmark.
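The cohort-average pitfall is easiest to see with concrete scores. A minimal sketch with made-up data: the class average clears a 75% benchmark even though two students sit well below the “passing” level.

```python
scores = [92, 88, 84, 80, 77, 76, 62, 58]  # illustrative cohort scores
benchmark = 75.0

average = sum(scores) / len(scores)           # 77.1 -> benchmark "met"
below = [s for s in scores if s < benchmark]  # yet two students are far below

print(f"Cohort average: {average:.1f}% (benchmark {benchmark:.0f}% "
      f"{'met' if average >= benchmark else 'not met'})")
print(f"Students below {benchmark:.0f}%: {len(below)} of {len(scores)} -> {below}")
```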
Assessment Plan Review
• Timeframes
• Is the formative assessment timeframe appropriate?
• Summative assessment is used to determine if program graduates are at the achievement level consistent with the program’s mission. Should we establish any different timeframes for summative measurement?
~ Would feedback obtained post-graduation from graduates or employers be valuable in the assessment process?
• Individual Responsible
• Are the individuals responsible for collecting assessment data appropriately identified in the plan?
• Are the individuals identified the best resource or should someone else be identified to perform this task?
• Do these individuals understand the importance of their respective roles in the assessment process?
Assessment Plan Review Cont.
• Reporting Results - Assessment results should be reported in a format that is correlated with the benchmark. If the benchmark is based upon a Likert scale, then the results should be reported using the same scale. “Actual” data must be reported.
• Are we reporting “generalizations” rather than actual results? For example, given the benchmark “All students will achieve a minimum 80%,” the results cannot be reported as “All students received over 80%.” This is not actual data and does not indicate how well the cohort performed. Did the distribution of scores identify multiple students barely exceeding the minimum benchmark, or were scores concentrated at the upper end of the grading scale? If using a class average as the benchmark, report the actual average score.
• When reporting the data, are we also reporting the sample size (“n” number), i.e., the number of data inputs reviewed to determine the reported results?
Collect and Trend the Data
Report the actual data
• On assessment plan
• On separate document
Should facilitate comparison
• Comparison of cohorts
• Comparison of students attending certain clinical settings
Show dates
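Trending is just the same results laid side by side with dates attached. A minimal sketch, assuming one mean score per cohort year for a single SLO (all values made up):

```python
# Three assessment cycles for one SLO, keyed by cohort year (made-up data).
results = {2017: 78.2, 2018: 80.5, 2019: 83.1}
benchmark = 80.0

for year in sorted(results):
    status = "met" if results[year] >= benchmark else "not met"
    print(f"{year}: {results[year]:.1f}% (benchmark {status})")

first, last = min(results), max(results)
print(f"Trend {first}-{last}: {results[last] - results[first]:+.1f} points")
```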
Data Analysis
• What does the data say about your students’ mastery of subject matter, of research skills, or of writing and speaking?
• What does the data say about your students’ preparation for taking the next career step?
• Do you see areas where performance is okay, but not outstanding, and where you’d like to see a higher level of performance?
UMass-Amherst, OAPA: http://www.umass.edu/oapa/oapa/publications/
Data Analysis
1. Identify benchmarks met
• Sustained effort
• Monitoring
• Evaluate benchmarks
2. Identify benchmarks not met
• Targets for improvement
• Study the problem before trying to solve it!!
• Evaluate benchmark
3. Identify 3 years of data (trend)
Assessment Plan Review
• Analysis of Assessment Results – the assessment plan’s value to the department lies in the evidence it offers about overall department or program strengths and weaknesses, and in the evidence it provides for change (Wright, 1991).
• What does the data say about the students’ mastery of subject matter?
• Were benchmarks met?
• Are students prepared as graduates of a JRCERT accredited program?
• What are the areas of program strengths?
• What are the areas of program weaknesses?
• Formally documented
Ongoing Assessment
is cumulative
is fostered when assessment involves a linked series of activities undertaken over time
may involve tracking progress of individuals or cohorts
is done in the spirit of continuous improvement
Closing the Cycle
The process of drawing conclusions should be open to all those who are likely to be affected by the results – the communities of interest.
Analysis of the assessment data needs to be shared and formally documented, for example, in meeting minutes from the Assessment or Advisory Committee.
2021 Standards – We want your feedback!
Contact Information
[email protected] www.jrcert.org
20 North Wacker Drive, Suite 2850
Chicago, IL 60606-3182
(312) 704-5300
THANK YOU!!
Thank you for supporting excellence in
education and quality patient care through
programmatic accreditation.
Please take a moment...
• https://www.surveymonkey.com/r/L8SY6XF