
RDAP 16 Poster: Evaluating Research Data Management Education: the good, the bad, and the ugly


Evaluating Research Data Management Education: the good, the bad, and the ugly

Regina Raboin, Assoc. Dir. Research & Education, [email protected]
Amanda Rinehart, Data Management Services, [email protected]
Amy Koshoffer, Science Informationist, [email protected]
Tiffany Grant, Research Informationist, [email protected]

Carlson, J., Sapp Nelson, M., Johnston, L. R., & Koshoffer, A. (2015). "Developing Data Literacy Programs: Working with Faculty, Graduate Students and Undergraduates." ASIS&T Bulletin, 41(6), 14-17.

The Good:
- Addressed professional needs and OSTP/federal agency requirements (OSU, UC, UMMS)
- Flexible curriculum development: modular, case studies, websites, customizable (UMMS)
- Cultivating relationships (UC, UMMS, OSU)
- Raised awareness (UC, UMMS, OSU)
- Individual consultations (UC, OSU)
- Group presentations (UC)
- Referrals (UC, OSU)

The Bad:
- Not specific to administrative student data or unfunded research (UC)
- Shorter sessions, less time between them (UC, UMMS)
- Not enough tool/software coverage (OSU)
- Too much content (UMMS)
- Time concerns (UC, UMMS)
- Updated content (UMMS)

Addressing (the Ugly):
- Pre-class surveys, customization (UMMS)
- Discipline- or audience-specific content (UC, OSU)
- Pilot courses, NECDMC MOOC (UMMS), Vet Med (OSU)

As research data management services become more common, so does evaluation of those services. Although there is some recent research on assessing faculty and student research data management skills, little addresses assessment of how we teach research data management (Carlson et al., 2015). We detail our own experiences with assessing data education activities and list the top lessons learned.

Photo courtesy of kunstfoto at https://www.flickr.com/photos/darknetportal/1258650180, altered to remove background, CC BY-NC-SA 2.0

Lessons Learned:
- Seek out and involve institutional collaborators.
- Don’t be afraid to switch it up – you can tweak the next session based on the feedback from the last.
- You can’t please everyone all the time – don’t try! Give up comprehensiveness in favor of low-hanging objectives; attendees will come back for more if you are immediately useful.
- It’s OK to not know the answer – many times, no one knows the answer because it hasn’t been determined yet. Credibility is key!
- Expect attrition from registration numbers; anywhere from 20-50% attendance is typical.
- Be aware that attendees, depending on their expertise, may need a more customized learning experience.
- Be flexible and don’t take any criticism personally!