
Academic Libraries, Government Information, and the Persistent Problem of Jargon

Jennifer Kirk, Alex Sundt, and Teagan Eastman*

* Jennifer Kirk, Government Information Librarian, Utah State University, [email protected]. Alex Sundt, Web Services Librarian, Utah State University, [email protected]. Teagan Eastman, Online Learning Librarian, Utah State University, [email protected].

The shift to born-digital and digitized materials has ultimately increased access and convenience for users, but in many ways it has also complicated the process of finding information. While users may struggle with catalog interfaces or reading call numbers, most have a basic understanding of how to locate a physical book. But in the digital environment, users have no built-in model for what sequence of clicks or keywords will get them to the information they need. This problem is exacerbated for specialized areas like government information, where more and more data and documents are readily available online via a variety of public web portals. Libraries often curate these portals using research guides or other domain-specific reference websites, providing major points of access for users. However, designing these specialized sites to be user-centered, rather than domain-centered, presents numerous challenges. For instance, how should the needs of different user groups be balanced? How should complex information be structured to support domain experts, while also helping orient and remove barriers for new users?

Answering these questions is especially important to Utah State University Libraries (USU), which serve as a Regional Depository for the Federal Depository Library Program (FDLP). USU’s Government Information Department supports not only our community of 25,000 undergraduate and 3,000 graduate students, many of whom learn at a distance, but also local and regional communities as a matter of public access. To be successful, these users need to understand and be able to effectively navigate the “library within a library” that is government information. To support this broad community and their range of needs, our websites need to strike the right balance between straightforward, content-focused design and more supportive, instruction-heavy design.

Literature Review

This tension between meeting the needs of expert and novice users is illustrated well by the problem of jargon. Technical language and jargon have been persistent barriers between library users and their information needs. Drawing from language used in library handouts and reference transactions at Carnegie Mellon University, Naismith and Stein analyzed users’ comprehension of common examples of jargon and found that among 100 participants, the correct definition was identified less than 50% of the time.1 Using similar methods, Chaudhry and Choo and Hutcherson reported progress, with participants choosing the correct definition 77% and 62% of the time respectively.2,3 Common strategies participants employed to make sense of unfamiliar terms included simply guessing, drawing on meanings of words from other contexts, multiword unpacking, and morphological analysis, in which participants broke down terms to their component parts in order to deduce the meaning.4,5

Despite users’ desire for simplicity, technical language persists in academic libraries for the same reason it persists in other professions: specialized vocabularies provide a common shorthand that facilitates communication around complex topics. While jargon is advantageous for specialists, it can be a significant hindrance for novice and intermediate users. In academic libraries, orienting oneself to technical language is often a necessary, if unacknowledged, step in understanding scholarship. For instance, “abstract,” which was only understood by 36% of users in Hutcherson’s study, is part of the academic lexicon and could not be easily replaced.6 In other cases, such as “OPAC,” terms may be meaningful only to librarians and continue to be used as a matter of convenience irrespective of their impact on users.

Such terms present significant usability problems, especially within web interfaces. Kupersmith reported on a number of library usability studies, finding generally that natural, action-oriented terms were preferred by users over acronyms, branded names, and legacy terms like “catalog” and “pathfinder.”7 Rather than eliminate jargon altogether, most authors recommend balanced solutions like providing a glossary of commonly misunderstood terms,8 defining terms the first time they are used,9,10,11 using jargon appropriately in instruction sessions,12 and being sensitive to potential confusion in reference interviews and other patron interactions.13,14

Despite a large body of evidence showing that users misunderstand the terms librarians use, jargon such as “reference desk” and “catalog” persists even when more usable alternatives have been well established. In addition, new and emerging services like scholarly communication and data management, as well as future services, present a nearly constant source of confusing new terms.15 Overcoming the barriers created by jargon, whether by finding more usable alternatives or by supporting users’ comprehension of terms through orientation and instruction, is crucial for helping users not only navigate library websites, but also engage in dialogue with the scholarly record and specific communities of practice in academia.

These problems are also greatly amplified in the government information environment, which confronts users with a range of specialized formats, technical and agency-specific terms, and frequent interface changes. Previous studies of FDLP depository libraries indicated that a perceived user emphasis on digital materials would either end the age of separate government information collections or introduce new responsibilities.16 Nineteen years later, separate collections persist and government information librarians still curate digital materials through library websites. In understanding this transition, members of the depository community have promoted staff training, emphasizing the role of library staff as mediators for information access.17 In a 2009 user-needs study, Burroughs found that government information users prefer web-based (digital) formats; as a result, library services were transformed to provide communication with researchers and to build a federated search tool for the collection.18 Finding a way to organize and access information in the digital domain has complicated collection access for both librarians and researchers. Librarians have long turned to usability testing to solve these problems by understanding user preferences and designing library websites that match users’ mental models.19,20,21,22

Research Questions

USU Libraries utilizes a continuous design strategy for the library website, and in 2018 it was determined that the Government Information website (currently hosted in Springshare’s LibGuides system) could benefit from a redesign. While jargon and usability testing are a common focus of the library literature, few studies have focused on best practices for designing government information collection websites or associated e-government websites.23,24,25 Some authors, such as Dowell, have undertaken similar design approaches focusing on other specialized collections, including special collections websites, noting that specific terminology and navigation issues could be best understood through usability testing.26 Given the complex nature of government information websites, the study’s research team was guided by the following questions:

• How problematic is technical language within the domain of government information?
• What strategies do users employ when they encounter confusing or unfamiliar terms?
• What design approaches help novice users within the domain of government information reference?

Obtaining answers to these questions would help the research team design a new government information website that supports user preferences and minimizes barriers.

Methods

Our study was conducted in two stages in Fall 2018. For the first stage, we analyzed how well students could define common jargon terms related to government information using a multiple-choice questionnaire. Fifteen terms (Table 1) were selected based on reference and instruction interactions with undergraduate and graduate students recorded from 2017–2018. The correct definitions were derived from two reference glossaries, while three additional incorrect choices were created by the authors for each term.27,28 Supplementary questions asked how much difficulty participants had in selecting a definition, the strategies they used when they were uncertain, what methods they preferred for getting help with confusing terminology, and how much of a barrier they perceived jargon to be generally. Participants were also asked to identify their year in school and rate how often they used the library website. Volunteers were asked to take a Qualtrics survey as they entered the library, a heavily trafficked, mixed-use classroom, study, and computer lab space. Because of this, we felt confident that our sample would include users with a range of experience and library knowledge, including non-users.
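As a concrete illustration of this item format (a term, one glossary-derived definition, and three author-written distractors), the sketch below shows how a single item might be represented and scored. The term, answer choices, and names are hypothetical stand-ins, not the study’s actual Qualtrics instrument.

```typescript
// Hypothetical representation of one survey item; not the study's instrument.
interface JargonItem {
  term: string;
  choices: string[];    // four answer choices shown to the participant
  correctIndex: number; // position of the glossary-derived definition
}

const exampleItem: JargonItem = {
  term: "Serial Set",
  choices: [
    "A bound set of issues of a single academic journal",
    "The numbered, bound series of U.S. congressional reports and documents",
    "A list of call numbers for periodicals held by the library",
    "A schedule of recurring library instruction sessions",
  ],
  correctIndex: 1,
};

// Count how many of a participant's selected choice indices match the key.
function score(items: JargonItem[], responses: number[]): number {
  return items.reduce(
    (total, item, i) => total + (responses[i] === item.correctIndex ? 1 : 0),
    0
  );
}

console.log(score([exampleItem], [1])); // 1
```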

For the second stage, usability tests were conducted with four undergraduate students. Adapting the HEX UX methods described by Scott W. Young of Montana State University at the 2017 Designing for Digital conference in Austin, TX, participants were asked to perform three tasks on four separate government information library websites, including USU’s Government Information LibGuide.29 Websites were selected from academic libraries that participate in the FDLP, as they serve populations similar to USU’s while upholding the commitment to public access inherent in FDLP participation. The selected websites represent a variety of visual layouts and information design approaches. This included sites that matched their main library website design, as well as those, like USU’s Government Information website, that used Springshare’s LibGuides system. Descriptions of each website are included in Table 2. Participants were asked to identify their year in school and rate how often they used the library website. Volunteers were recruited through print and digital advertising in the library.

TABLE 1. Terms Used in Jargon Analysis

Government information-specific terms: Legislative History; Government Document; Resolution; Regulation; Case Law; Hearing; Serial Set; Superintendent of Documents (SuDoc System); Technical Report

General library terms: Database; Citation; Index; Finding Aid; Microform; Primary Source


A team of two study coordinators observed and took notes during usability sessions, which lasted around 30 minutes. Participants were given a laptop with a document linking to each of the four websites, which were ordered differently for each session, and were asked to complete three usability tasks provided on a sheet of paper for each website (Table 3). In addition to observations and comments from participants elicited using the “think aloud” method, test sessions provided ample opportunities to talk with participants about their web design and help-seeking preferences.30 Sessions were recorded in QuickTime to allow for review.
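The paper states only that the four sites were “ordered differently for each session.” As one hedged illustration of how such orderings could be produced, the sketch below rotates the site list so each website appears in each position once across the four sessions; this is an assumption about the counterbalancing scheme, not the authors’ documented procedure.

```typescript
// One possible way to vary site order across sessions: rotate the list so each
// website appears in each position once over four sessions. The actual ordering
// scheme used in the study is not specified in the paper.
const sites = ["Website 1", "Website 2", "Website 3", "Website 4"];

function sessionOrder(sessionIndex: number): string[] {
  const n = sites.length;
  return sites.map((_, i) => sites[(i + sessionIndex) % n]);
}

for (let s = 0; s < sites.length; s++) {
  console.log(`Session ${s + 1}: ${sessionOrder(s).join(", ")}`);
}
```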

Results

Jargon Analysis

In all, 43 undergraduate students, one graduate student, and one high school student completed the jargon survey. Participants were generally frequent users of the library’s website, with 40% responding that they used the website “2-3 times a month” and 36% selecting “weekly.” Among the questions answered by all participants, 229 were answered accurately, meaning that participants appeared to understand the terms 34% of the time. On average, participants answered five questions accurately, with five also being the median number of correct answers. Only three participants selected eight or more correct definitions, meaning only 7% of participants could accurately define more than 50% of the terms. Twenty-six (58%) participants were only able to accurately define four (27%) or fewer of the 15 terms.
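As a quick consistency check (assuming all 45 participants responded to all 15 items), these figures agree with one another:

```latex
\[
45 \times 15 = 675 \ \text{responses}, \qquad
\frac{229}{675} \approx 0.34, \qquad
\frac{229}{45} \approx 5.1 \ \text{correct answers per participant.}
\]
```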

Table 4 lists terms from most to least understood. Because two definitions for “database” were considered accurate, this term was the most well-understood, with 62% selecting one of two correct definitions.

TABLE 2. Usability-Testing Website Selections

Website 1: LibGuide; digital collection access; additional help via librarian contact; description-type instruction element; top-bar navigation; includes links to subject guides.

Website 2: Standard website; digital collection access; additional help via librarian contact; description and tutorial instruction elements; left-side navigation.

Website 3: LibGuide and standard website; digital collection access; additional help via librarian contact; description-type instruction element; left-side navigation.

Website 4: Standard website; digital collection access; additional help via general contact; description-type instruction element; left-side navigation.

Additional notes: one site’s collection is spread across multiple libraries; one site includes a chat pop-up.

TABLE 3. User Testing Usability Tasks

Task 1: Find the location and/or call number for a physical government document named “Washington Monument Grounds: Cultural landscape report.”

Task 2: Find out where you can get additional help with government information questions.

Task 3: Find a link to GovInfo.gov/FDsys, the website that provides the official publications for all three branches of the U.S. government.


“Citation” was also very well-understood, with 27 participants (60%) selecting the correct definition. Among the other topmost terms, “index,” “finding aid,” “legislative history,” and “microform” were all understood by more than 40% of participants. Terms that were least-understood include “serial set,” “resolution,” “technical report,” “case law,” and “primary source,” with less than 25% of participants selecting the correct answer. Overall, terms specific to government information seem to be poorly understood, with participants able to select an accurate answer only 25% of the time, whereas more general library terms were understood 47% of the time.

The questionnaire asked participants to rate how much difficulty they experienced in defining unfamiliar terms, and what strategies they used to select a definition. Overall, participants experienced difficulty, with 67% selecting “somewhat difficult” and 31% selecting “very difficult.” Common strategies used to choose a definition included deduction, multiword unpacking/morphological analysis (37%), guessing based on library/research experience (32%), and random selection (16%). Next, participants were presented with a list of support features and asked to select their preferred options. Results from this question are ranked from most to least selected in Table 5. An FAQ was the most frequently selected (29), followed closely by complete replacement of confusing terms (27), and a comprehensive glossary of terms (25). Participants preferred practical features that enabled them to find solutions independently, while high-touch support from library staff and an introductory video were far less popular. Four participants suggested their own solutions, which were all some variation on having a modal window pop-up to define commonly-confused terms.

Finally, we were curious about how much users perceived jargon to be a barrier to finding information or accomplishing tasks online. Only three participants responded that they found technical terms to be a huge barrier, while 16 participants responded that they didn’t need to know every term to be successful. The majority of participants, 26 (57%), responded that they were somewhat confused but felt confident they could figure it out with some support.

TABLE 4. Ranking of Terms from Most to Least Understood (No. of Accurate Answers)

Database: 28 (62%)
Citation: 27 (60%)
Index: 21 (47%)
Finding Aid: 20 (44%)
Legislative History: 19 (42%)
Microform: 19 (42%)
Government Document: 18 (40%)
Hearing: 16 (35%)
Superintendent of Documents (SuDoc System): 14 (31%)
Regulation: 12 (27%)
Primary Source: 11 (24%)
Case Law: 9 (20%)
Technical Report: 9 (20%)
Resolution: 4 (9%)
Serial Set: 2 (4%)
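The category-level rates reported above (roughly 25% for government information-specific terms and 47% for general library terms) can be reproduced from the Table 4 counts, assuming 45 respondents per term. The short sketch below is a verification aid, not analysis code from the study.

```typescript
// Verification sketch: recompute category-level comprehension rates from the
// Table 4 counts, assuming 45 respondents per term.
const correctCounts: Record<string, number> = {
  // Government information-specific terms
  "Legislative History": 19, "Government Document": 18, "Hearing": 16,
  "Superintendent of Documents": 14, "Regulation": 12, "Case Law": 9,
  "Technical Report": 9, "Resolution": 4, "Serial Set": 2,
  // General library terms
  "Database": 28, "Citation": 27, "Index": 21,
  "Finding Aid": 20, "Microform": 19, "Primary Source": 11,
};

const participants = 45;
const govTerms = ["Legislative History", "Government Document", "Hearing",
  "Superintendent of Documents", "Regulation", "Case Law", "Technical Report",
  "Resolution", "Serial Set"];
const generalTerms = ["Database", "Citation", "Index", "Finding Aid",
  "Microform", "Primary Source"];

// Share of all responses in a category that selected the correct definition.
function rate(terms: string[]): number {
  const correct = terms.reduce((sum, t) => sum + correctCounts[t], 0);
  return correct / (terms.length * participants);
}

console.log(rate(govTerms).toFixed(2));     // ≈ 0.25
console.log(rate(generalTerms).toFixed(2)); // ≈ 0.47
```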

TABLE 5. Ranking of User Support Features from Most to Least Frequently Selected (No. of Responses)

FAQ that defines common terms: 29
Provide short definitions whenever terms are used: 29
Replace confusing terms with better ones: 27
Glossary: 25
Make it easier to contact a person for help: 9
Short, intro video on webpage: 8
Provide personal assistance at point of need: 7
Teach this during class visits: 4


Usability Testing

Four undergraduate students participated in our usability tests: one sophomore, one junior, and two seniors. Three of the participants responded that they used the library website weekly, and one participant indicated they used the library website a few times a semester. All participants indicated they were somewhat unfamiliar with government information.

The participants were asked to complete the same three tasks across four separate websites (see Table 3 for tasks), for a total of sixteen attempts for each task. For Task 1, participants successfully found a call number for a specific item 81% of the time. For Task 2, participants successfully found government information contact or help information 87% of the time, though multiple participants noted they would have preferred a different mode of help. For Task 3, 87% of participants located the link to GovInfo.gov, though this typically took participants longer due to confusion over labels. While each website generally performed well for each of the tasks, observations and commentary from participants as they worked through each scenario provided rich insights into their preferences and general approaches to library research. Several themes emerged from these observations and elicitations (Table 6).

In particular, help-seeking behaviors stood out in discussions with participants. For Task 2, each participant commented that they preferred to look for self-guided solutions (e.g., FAQs) before reaching out for help. Several participants also noted that when they did want or need personal help, they preferred to use chat features because they expected to get a quicker response with lower interaction costs. For example, one participant noted that email required private information and more formal communication, whereas for chat and phone calls, the transaction could move directly to the problem at hand. Although users liked the immediacy of chat, many found pop-up chat boxes off-putting and confrontational. Participants who preferred chat were also skeptical about the level of service they would get through the general help service. Website #4, for example, did not include any contact or help service specific to government information. Although a general “Ask” service was easily accessible through the main menu, one participant worried that this option would result in getting “bounced around.” Similarly, another participant was discouraged by a generic email address for government information that included no name or specific contact information for the specialist responsible for that area.

Issues surrounding website search features came up repeatedly. Multiple participants expressed confusion over the difference between searching the catalog, discovery layer, and library website. Not only were these options often presented on the same page, but many links also directed users to external catalogs and search engines, creating a disjointed experience. Participants were also confused by unclear branding of search engines, such as institutionally branded catalog labels, and the repeated use of labels such as “government information” without clear contextual information on how these tools differed. Overall, participants seemed baffled by this multi-layered ecosystem. Two participants expressed a desire for a single-search experience that would cover all publications, including both catalog materials and specific government information databases.

TABLE 6. Themes Identified across Usability Tests (Mentioned by Participants)

Self-helpers: 1, 2, 3, 4
Preference for chat: 1, 4
Skeptical of help service: 1, 4
Search issues: 2, 3, 4
Information overload, duplicate links: 1, 2, 3, 4
Need for consistent link descriptions: 1, 2, 3, 4
Preference for natural or task-based headings: 1, 2, 3, 4
Competing navigation menus: 1, 3, 4


As the tests were set up to elicit comparisons, participants naturally commented on design aspects they preferred across the different sites. Multiple participants expressed a sense of being overwhelmed by the amount of content, in particular links, that they encountered when navigating. Specifically, websites built in LibGuides were difficult to scan due to confusing layouts, duplicated terminology (e.g., “government information”), and numerous links. This problem also extended to link descriptions, which varied across the tested websites. In some cases, participants were overwhelmed by lengthy descriptions, while in other cases descriptions were too vague, or not provided at all. This inconsistency, combined with poor link text, hindered participants’ ability to navigate and confidently make selections.

Many participants also commented on navigation menus. Participants seemed to prefer heading labels that were action-oriented, such as “Finding Government Information,” and which used natural, task-focused language. On the other hand, vague headings such as “Freely accessible” or headings based around federal, state, and international governments were less successful, as participants struggled to anticipate content or which level of government was appropriate for their needs. Menu design was also mentioned frequently. In particular, participants expressed some confusion when encountering multiple menu hierarchies within the same page, for example when main library navigation was provided at the top of the page versus left-hand navigation for in-page government information content. While participants generally understood the difference between these options, the menus competed for users’ attention, forcing them to pause and re-orient themselves.

Discussion

Jargon Analysis

A number of factors seem to have influenced the poor comprehension of government-related terms. Popular strategies, like drawing from previous experience or meanings from other contexts, were generally unreliable for government-related jargon, particularly for terms like “Superintendent of Documents” that deviate significantly from colloquial usage. Even drawing from past library experience was ineffective. For the term “serial set,” 62% of participants selected a definition that best described a bound journal. Misunderstandings also seemed to stem from competing definitions and ambiguous meanings of certain words. For example, most participants defined “primary source” as “the original author or source of a piece of information about a historical event.” While this usage is common for journalists, it deviates from definitions used in archival fields. Similarly, 20 participants (44%) defined government document as “a paper copy of a report written by federal, state, or local government staff,” suggesting that many users still associate “document” with print material. This suggests a need to provide contextual information to users, perhaps through modal window pop-ups that define how the terms are used in a government information context.
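One lightweight way to provide that kind of in-context help, offered here only as an illustrative sketch rather than the authors’ implementation, is to wrap known terms in a guide’s text with an element that exposes a short definition on hover; a modal or tooltip component could serve the same role. The glossary wording and element choices below are placeholder assumptions.

```typescript
// Illustrative sketch only: surface a short, context-specific definition when a
// known term appears in a guide's text. The glossary entries are placeholder
// wording, not the authors' definitions or interface.
const glossary: Record<string, string> = {
  "serial set": "The bound, numbered series of U.S. congressional reports and documents.",
  "legislative history": "The documents produced as a bill moves through Congress.",
};

// Wrap each glossary term found in the container with an <abbr> element whose
// title attribute shows the definition on hover.
function annotateTerms(container: HTMLElement): void {
  for (const [term, definition] of Object.entries(glossary)) {
    const pattern = new RegExp(`\\b${term}\\b`, "gi");
    container.innerHTML = container.innerHTML.replace(
      pattern,
      (match) => `<abbr title="${definition}">${match}</abbr>`
    );
  }
}

// Usage (assuming a container element with id "guide-content" exists):
// annotateTerms(document.getElementById("guide-content") as HTMLElement);
```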

Survey participants were more successful in defining general library jargon, a result which may have more to do with changing technology than with gains made through user education. For example, 60% selected the correct definition for “citation,” a strong improvement over Naismith and Stein’s finding of 35%,31 and a slight improvement over later success rates of 55% and 51.7%.32,33 Popular adoption of tools like Google Scholar, which launched in 2004, could be one explanation for this increase in awareness. In contrast, the shift away from microform formats may account for lower comprehension of the term “microform,” which was understood by only 42% of participants, compared to 76% reported by Naismith and Stein and 67% reported for “microfilm” by Chaudhry and Choo.34,35

Not surprisingly, “database,” which included two correct definitions, was the most broadly understood term in our analysis, with eleven participants (24%) selecting a more library-centric definition and seventeen (38%) selecting the broader, technical definition. While this lack of a clear definition may be problematic for communicating with library users, it reflects the intertwined nature of technology and the modern library. In the age of Google, users may not need a clear mental model of what a bibliographic database is in order to use one effectively. Instead, the emphasis has been on balancing users’ preference for task-oriented language with librarians’ desire to communicate a broad range of source types, hence the use of compound terms like “Articles and More (Databases).”36 While target words are certainly more usable, there is a clear role for library instruction that is focused less on what tools are called and more on developing skills and helping students recognize when to adapt their search strategies.

Usability Testing

Although they were generally able to complete the tasks provided for each site, problems with information overload and the presentation of links slowed down participants’ ability to scan. Large headers and font, as well as visually-attractive CSS, were both mentioned by participants as design elements that helped them feel less overwhelmed. Additionally, problems with link labels and descriptions resulted in poor selections or overlooked resources. To mitigate these problems, designers of government information and other domain-specific library websites should employ good information design principles to make it easier for users to navigate complex content. Lists of links should be “chunked” under clear headings and be appropriately described to improve users’ ability to scan, and content should be balanced to avoid overwhelming users.
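As a minimal sketch of that chunking principle, using hypothetical headings, links, and descriptions rather than content from any tested site, each group of links can be modeled as a labeled chunk and rendered under its own task-based heading:

```typescript
// Illustrative sketch: group links under clear, task-based headings with short,
// consistent descriptions. Headings, URLs, and descriptions are hypothetical.
interface DescribedLink {
  label: string;
  url: string;
  description: string; // one short sentence, kept consistent across lists
}

interface LinkChunk {
  heading: string; // natural, task-focused wording
  links: DescribedLink[];
}

const chunks: LinkChunk[] = [
  {
    heading: "Finding Government Information",
    links: [
      {
        label: "Congress.gov",
        url: "https://www.congress.gov/",
        description: "Search bills, resolutions, and legislative histories.",
      },
    ],
  },
];

// Render each chunk as a heading followed by a short, described list of links.
function renderChunks(groups: LinkChunk[]): string {
  return groups
    .map((chunk) => {
      const items = chunk.links
        .map((l) => `<li><a href="${l.url}">${l.label}</a>: ${l.description}</li>`)
        .join("");
      return `<h3>${chunk.heading}</h3><ul>${items}</ul>`;
    })
    .join("");
}

console.log(renderChunks(chunks));
```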

A broader area of concern is how government information and other specialized collection pages fit within the larger library website. Because these sites are commonly integrated as part of a main library website, there is potential confusion caused by competing navigation systems. In our tests, participants recognized that left-hand menus corresponded to page-specific content, while items placed at the top of the page corresponded to site-wide content. Working within these conventions is the best way to ensure users understand how different navigation systems relate to each other. Additionally, user-centered menu headings and features like breadcrumbs can help users avoid getting lost in complex hierarchies. In the case of LibGuides and links to external websites, designers should make efforts to ensure users can easily retrace their steps to main library pages, such as by ensuring links don’t automatically open in a new window and including links back to main pages where possible.

In addition to implications for content and design, discussions with users also focused on key website features like search and help. Users often sought single search boxes that would span physical and digital government collections. Examples like the Government Publishing Office’s Metalib, and other discovery layer implementations in academic library settings, point to models for addressing problems with multiple portals and search tools.37 However, given potential confusion with the perception of government collections as a library-within-a-library, consideration should be given to the design and user experience of single search engines for specialized collections.

Users also had strong preferences for the design of contact and help features. Users indicated they preferred self-guided help options and viewed in-person help as an option of last resort, a theme which was echoed in our jargon analysis, and also found by Benedetti.38 However, many also mentioned they would seek one-on-one help if needed and provided insight into ways to reduce friction for these services. Librarian profiles and direct email addresses, as opposed to generic contact options, were perceived as friendlier and more “human,” pointing toward easy ways to encourage help-seeking and reduce potential anxiety surrounding such features. Participants in our study valued their independence and clearly preferred chat for its speed and low interaction costs; other users may have a lower tolerance for self-guided solutions or prefer different methods for getting help. While our findings point to ways to improve the help experience, questions remain regarding whether users know when they need more in-depth help and at what point self-guided solutions may be inadequate.


Limitations

While our study ultimately revealed important trends in users’ understanding of government information and preferences for website design, there were some limitations to our approach. First, while four individuals participated in our usability testing, we did not reach the five participants commonly recommended as ideal for user testing.39 Understanding that testing more participants would have strengthened the results and enabled researchers to make stronger conclusions, the research team opted to use this as a starting point and plans to conduct more user tests as the new website is developed.

In retrospect, some definitions in our jargon analysis could have been improved. In particular, more than one definition provided for terms like “hearing” and “primary source” could have been considered correct, depending on usage. While the questionnaire made it clear that the terms were being used in a library or government information context, more effort could have been made to make this context clear for each question. At the same time, alternative definitions needed to seem plausible in order to avoid making it too easy for participants to guess the right answer. Additionally, because many technical terms used in libraries and government information can be easily confused with other common usages, we feel the definitions we chose were appropriate and reflected the challenges users encounter in real-world situations.

Conclusion

Through testing, we uncovered users’ strategies when they encountered new or confusing terms, and identified appropriate terminology and instructional design approaches to complement these strategies. Participants indicated that unfamiliar technical language can be challenging, but they prefer to work through problems on their own before reaching out for help from library staff. Participants commonly relied on the meaning of terms in other contexts to interpret unfamiliar terms, findings that will guide the design of a new Government Information Department website for USU Libraries. More broadly, findings from our studies suggest that more could be done to improve the usability of complex library collections.

Beyond government information, other specialized areas such as music, business, and archives would benefit from acknowledging the inherent complexity of their collections and designing user experiences that integrate support and instruction at various levels and in multiple modalities. Jargon has proven to be a persistent problem in libraries, but may represent just the tip of the iceberg in terms of the complexities that users face. To help users succeed in complex information-seeking tasks, librarians must confront the information and instructional design problems posed by specialized library and disciplinary domains. However, with careful design and attention to users’ needs, these complexities can be overcome and users can be empowered as they transition from novices to expert information consumers.


Endnotes

1. Rachael Naismith and Joan Stein, “Library Jargon: Student Comprehension of Technical Language Used by Librarians,” College and Research Libraries (September 1989): 551.
2. Abdus Sattar Chaudhry and Meng Choo, “Understanding of Library Jargon in the Information Seeking Process,” Journal of Information Science 27, no. 5 (October 2001): 347, https://doi.org/10.1177/016555150102700505.
3. Norman B. Hutcherson, “Library Jargon: Student Recognition of Terms and Concepts Commonly Used by Librarians in the Classroom,” College and Research Libraries 65, no. 4 (July 2004): 352, https://doi.org/10.5860/crl.65.4.349.
4. Naismith and Stein, “Library Jargon,” 550-551.
5. Chaudhry and Choo, “Library Jargon,” 347-349.
6. Hutcherson, “Library Jargon,” 351.
7. John Kupersmith, “Library Terms that Users Understand,” UC Berkeley: Library (2012): 3, https://escholarship.org/uc/item/3qq499w7.
8. Kupersmith, “Library Terms,” 3.
9. Naismith and Stein, “Library Jargon,” 551.
10. Hutcherson, “Library Jargon,” 354.
11. Chaudhry and Choo, “Library Jargon,” 348.
12. Hutcherson, “Library Jargon,” 354.
13. Naismith and Stein, “Library Jargon,” 551.
14. Chaudhry and Choo, “Library Jargon,” 348.
15. Allison R. Benedetti, “Promoting Library Services with User-Centered Language,” portal: Libraries and the Academy 17, no. 2 (2017): 226, https://doi.org/10.1353/pla.2017.0013.
16. Duncan M. Aldrich, Gary Cornwell, and Daniel Barkley, “Changing Partnerships? Government Documents Departments at the Turn of the Millennium,” Government Information Quarterly 17, no. 3 (2000): 274, https://doi.org/10.1016/S0740-624X(00)00035-6.
17. Stephanie Ford, “Public Access to Electronic Federal Depository Information in Regional Depository Libraries,” Government Information Quarterly 14, no. 1 (1997): 60-61, https://doi.org/10.1016/S0740-624X(97)90051-4.
18. Jennie M. Burroughs, “What Users Want: Assessing Government Information Preferences to Drive Information Services,” Government Information Quarterly 26, no. 1 (January 2009): 212-213, https://doi.org/10.1016/j.giq.2008.06.003.
19. Ruth Dickstein and Vicki Mills, “Usability Testing at the University of Arizona Library: How to Let Users in on the Design,” Information Technology and Libraries 19, no. 3 (2000): 144-151.
20. Jennifer L. Ward and Steve Hiller, “Usability Testing, Interface Design, and Portals,” Journal of Library Administration 43, no. 1-2 (2005): 155-171.
21. Emily B. Paladino, Jacqueline C. Klentzin, and Chloe P. Mills, “Card Sorting in an Online Environment: Key to Involving Online-Only Student Population in Usability Testing of an Academic Library Website?,” Journal of Library and Information Services in Distance Learning 11, no. 1-2 (2017): 37-49, https://doi.org/10.1080/1533290X.2016.1223967.
22. Danielle A. Becker and Lauren Yannotta, “Modeling a Library Web Site Redesign Process: Developing a User-Centered Web Site Through Usability Testing,” Information Technology and Libraries 32, no. 1 (2013): 6-22.
23. Zhao Huang and Morad Benyoucef, “Usability and Credibility of E-Government Websites,” Government Information Quarterly 31, no. 4 (2014): 584-595, https://doi.org/10.1016/j.giq.2014.07.002.
24. Natalie Greene Taylor, Paul T. Jaeger, Ursula Gorham, John Carlo Bertot, Ruth Lincoln, and Elizabeth Larson, “The Circular Continuum of Agencies, Public Libraries, and Users: A Model of E-Government in Practice,” Government Information Quarterly 31 (2014): S18-S25.
25. Renata Machova, Miloslav Hub, and Martin Lnenicka, “Usability Evaluation of Open Data Portals: Evaluating Data Discoverability, Accessibility and Reusability from a Stakeholders’ Perspective,” Aslib Journal of Information Management 70, no. 3 (2018): 252-268, https://doi.org/10.1108/AJIM-02-2018-0026.
26. Erica Dowell, “Web Site Usability for Rare Book and Manuscript Libraries,” RBM: A Journal of Rare Books, Manuscripts, and Cultural Heritage 9, no. 2 (September 2008): 168-182, https://doi.org/10.5860/rbm.9.2.306.
27. Leslie F. Stebbins, “Glossary,” in Student Guide to Research in the Digital Age: How to Locate and Evaluate Information Sources (Westport, CT: Libraries Unlimited, 2006), 179-185.
28. Diane L. Garner and Diane H. Smith, “Glossary,” in The Complete Guide to Citing Government Information Resources: A Manual for Writers and Catalogers (Bethesda, MD: Congressional Information Services, Inc., 1993), 199-206.
29. Scott W. Young, “Introducing HEX UX: An Integrated Method for Generating, Analyzing, and Reporting User Experience Data” (presentation, Designing for Digital 2017, Austin, TX, March 2017), https://scottwhyoung.com/talks/hex-ux-2017/.
30. Jakob Nielsen, Usability Engineering (San Francisco: Morgan Kaufmann, 1993), 195.
31. Naismith and Stein, “Library Jargon,” 548.
32. Hutcherson, “Library Jargon,” 351.
33. Chaudhry and Choo, “Library Jargon,” 346.
34. Naismith and Stein, “Library Jargon,” 548.
35. Chaudhry and Choo, “Library Jargon,” 346.
36. Laura Cobus, Valeda Frances Dent, and Anita Ondrusek, “How Twenty-Eight Users Helped Redesign an Academic Library Web Site: A Usability Study,” Reference & User Services Quarterly 44, no. 3 (2005): 232-246.
37. Burroughs, “What Users Want,” 212-213.
38. Benedetti, “Promoting Library Services with User-Centered Language,” 226.
39. Jakob Nielsen, “Why You Only Need to Test with 5 Users,” Nielsen Norman Group, published March 19, 2000, https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/.


Bibliography

Aldrich, Duncan M., Gary Cornwell, and Daniel Barkley. “Changing Partnerships? Government Documents Departments at the Turn of the Millennium.” Government Information Quarterly 17, no. 3 (2000): 274. https://doi.org/10.1016/S0740-624X(00)00035-6.
Becker, Danielle A., and Lauren Yannotta. “Modeling a Library Web Site Redesign Process: Developing a User-Centered Web Site Through Usability Testing.” Information Technology and Libraries 32, no. 1 (2013): 6-22.
Benedetti, Allison R. “Promoting Library Services with User-Centered Language.” portal: Libraries and the Academy 17, no. 2 (2017): 217-234. https://doi.org/10.1353/pla.2017.0013.
Burroughs, Jennie M. “What Users Want: Assessing Government Information Preferences to Drive Information Services.” Government Information Quarterly 26, no. 1 (2009): 203-218.
Chaudhry, Abdus Sattar, and Meng Choo. “Understanding of Library Jargon in the Information Seeking Process.” Journal of Information Science 27, no. 5 (October 2001): 343-349. https://doi.org/10.1177/016555150102700505.
Cobus, Laura, Valeda Frances Dent, and Anita Ondrusek. “How Twenty-Eight Users Helped Redesign an Academic Library Web Site: A Usability Study.” Reference & User Services Quarterly 44, no. 3 (2005): 232-246.
Dickstein, Ruth, and Victoria A. Mills. “Usability Testing at the University of Arizona Library: How to Let the Users in on the Design.” Information Technology & Libraries 19, no. 3 (September 2000): 144-151.
Dowell, Erica. “Web Site Usability for Rare Book and Manuscript Libraries.” RBM: A Journal of Rare Books, Manuscripts, and Cultural Heritage 9, no. 2 (September 2008): 168-182. https://doi.org/10.5860/rbm.9.2.306.
Ford, Stephanie. “Public Access to Electronic Federal Depository Information in Regional Depository Libraries.” Government Information Quarterly 14, no. 1 (1997): 51-63. https://doi.org/10.1016/S0740-624X(97)90051-4.
Garner, Diane L., and Diane H. Smith. The Complete Guide to Citing Government Information Resources: A Manual for Writers and Catalogers. Bethesda, MD: Congressional Information Services, Inc., 1993.
Huang, Zhao, and Morad Benyoucef. “Usability and Credibility of E-Government Websites.” Government Information Quarterly 31, no. 4 (2014): 584-595. https://doi.org/10.1016/j.giq.2014.07.002.
Hutcherson, Norman B. “Library Jargon: Student Recognition of Terms and Concepts Commonly Used by Librarians in the Classroom.” College and Research Libraries 65, no. 4 (July 2004): 349-354. https://doi.org/10.5860/crl.65.4.349.
Kupersmith, John. “Library Terms That Users Understand.” UC Berkeley: Library (February 2012): 1-37.
Machova, Renata, Miloslav Hub, and Martin Lnenicka. “Usability Evaluation of Open Data Portals: Evaluating Data Discoverability, Accessibility and Reusability from a Stakeholders’ Perspective.” Aslib Journal of Information Management 70, no. 3 (2018): 252-268. https://doi.org/10.1108/AJIM-02-2018-0026.
Naismith, Rachael, and Joan Stein. “Library Jargon: Student Comprehension of Technical Language Used by Librarians.” College and Research Libraries (September 1989): 543-552.
Nielsen, Jakob. “Why You Only Need to Test with 5 Users.” Nielsen Norman Group, published March 19, 2000. https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/.
Nielsen, Jakob. Usability Engineering. San Francisco: Morgan Kaufmann, 1993.
Paladino, Emily B., Jacqueline C. Klentzin, and Chloe P. Mills. “Card Sorting in an Online Environment: Key to Involving Online-Only Student Population in Usability Testing of an Academic Library Web Site?” Journal of Library & Information Services in Distance Learning 11, no. 1-2 (2017): 37-49. https://doi.org/10.1080/1533290X.2016.1223967.
Stebbins, Leslie F. Student Guide to Research in the Digital Age: How to Locate and Evaluate Information Sources. Westport, CT: Libraries Unlimited, 2006.
Taylor, Natalie Greene, Paul T. Jaeger, Ursula Gorham, John Carlo Bertot, Ruth Lincoln, and Elizabeth Larson. “The Circular Continuum of Agencies, Public Libraries, and Users: A Model of E-Government in Practice.” Government Information Quarterly 31 (2014): S18-S25.
Ward, Jennifer L., and Steve Hiller. “Usability Testing, Interface Design, and Portals.” Journal of Library Administration 43, no. 1-2 (2005): 155-171.
Young, Scott W. “Introducing HEX UX: An Integrated Method for Generating, Analyzing, and Reporting User Experience Data.” Presentation at the 2017 Designing for Digital conference, Austin, TX, March 2017. https://scottwhyoung.com/talks/hex-ux-2017/.

