Intranet Search #fail
#SPSBE07 | Ben van Mol, Ventigrate | April 26th, 2014
Thanks to our sponsors!
Gold
Silver
SharePoint Team Lead & Presales
@vanmobe
ben.vanmol@ventigrate.be
facebook.com/ventigrate
@ventigrate
linkedin.com/company/ventigrate
info@ventigrate.be
Veldkant 33A, BE-2550 Kontich
TEL: +32 (0)3 450 80 30
FAX: +32 (0)3 450 80 39
Who am I?
Agenda
Why Enterprise search fails
Search Experience factors
Where to start?
The search feature jungle
Analysis & improvement
Questions
Why Enterprise Search fails
Thanks to Google (and Bing, of course):
• Search technology is widespread
• Search is highly adopted
• Search is perceived as an easy tool to explore and find relevant information
So…
Search engines are installed, the indexing engines start indexing the content, and everybody is feeling lucky!
Or is it not that simple?
All I want is
How complicated can it be?!
False expectations
Enterprise data is complex
The information needs within an organization span a wide variety of information types, sources, and formats.
Popularity and the number of referrals are less important in enterprise search than in internet search.
Google builds its metadata from millions of users searching for content; the enterprise is a much smaller case.
In an enterprise, lots of people create content with little attention paid to information governance.
The user is complex
“This is a huge change to the overall user experience. It transforms the way we think and opens opportunities to use search in a disruptive fashion. I love it!”
“Personally, I think people will get annoyed with it. The interface itself isn’t anything new, and it’s an outdated concept. When you think about state-of-the-art search, it should be less about searching and more about finding.”
Search Experience factors
Expertise
How would you take a picture?
Expertise significantly impacts how we seek information online.
The effects on search are determined by
• Domain knowledge
• Technical knowledge
http://bit.ly/1pQe5dv
The User
How would you take a picture?
User expertise
Novices orienteer, experts teleport
Search Experience factors
Cognitive Styles
Serialists versus Holists
Serialists concentrate on the individual parts rather than the whole.
Holists focus on the cohesive whole rather than on the components.
Draw a vertical line inside the rectangle – the Rod-and-Frame test (Witkin & Asch)
Serialists versus holists
• Spend 50% more time
• Visit twice as many pages
• Are more likely to use the browser’s back button
BUT: the performance gap vanishes if technical expertise is equally high.
Source: Kim K. Information seeking on the web: Effects of user and task variables. Library & Information Science Research. 2001;23:233–255.
Verbal versus visual
A 30 percent rise in effective problem solving occurs when verbal and visual instructions are presented in conjunction.
Source: Paivio A. Imagery and Verbal Processes. New York: Holt, Rinehart and Winston; 1971.
Mayer R, Sims VK. For whom is a picture worth a thousand words? Extensions of a dual-coding theory of multimedia learning. Journal of Educational Psychology. 1994;86:389–401.
Search Experience factors
Information Seeking
Information foraging
Information retrieval is similar to how a bear searches for food: a cost/benefit analysis of the amount of effort needed.
• Information scent triggers attention
• Trigger words in descriptions, titles, hyperlinks
• When information scent rises as we click links, we conclude we are heading in the right direction
• Between-patch time determines the amount of time spent on a page
The faster the internet connection, the less time we spend on a page!
Where to start?
Who needs search?
• No, thanks…
- Low on content
- To solve poor navigational design
- No time to optimize
- If users would rather browse than search
• Yes, please!
- Too much content to browse
- To tame dynamism
- To help fragmented sites
Different schools
Continuous improvement
Search is a continuous improvement process:
• Small iterations with PDCA cycles
• Requires management buy-in
• End-user involvement
• Good communication
• Means to contribute
Plan → Do → Check → Act
Recall
Definition: RECALL is the ratio of the number of relevant records retrieved to the total number of relevant records in the database. It is usually expressed as a percentage.
Quick Relevancy Test: ideally, each query’s most reasonable result is ranked at position #1.
Precision
Definition: PRECISION is the ratio of the number of relevant records retrieved to the total number of irrelevant and relevant records retrieved. It is usually expressed as a percentage.
Quick Precision Test: grade each retrieved result as Relevant (r), Near (n), Misplaced (m), or Irrelevant (i), then score at three levels:
• Strict = r
• Loose = r + n
• Permissive = r + n + m
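The recall and precision definitions above, together with the strict/loose/permissive grading scheme, can be sketched in a few lines of Python. The grades list is invented sample data for a hypothetical query, not figures from the deck:

```python
def recall(relevant_retrieved, total_relevant):
    """RECALL: relevant records retrieved / total relevant records in the index."""
    return relevant_retrieved / total_relevant

def precision(relevant_retrieved, total_retrieved):
    """PRECISION: relevant records retrieved / total records retrieved."""
    return relevant_retrieved / total_retrieved

# Quick Precision Test: grade each of the top results, then score at
# three levels of strictness (sample grades for one hypothetical query).
grades = ["r", "n", "r", "m", "i", "n", "r", "i", "m", "i"]
n = len(grades)
strict = grades.count("r") / n                                                 # r
loose = (grades.count("r") + grades.count("n")) / n                            # r + n
permissive = (grades.count("r") + grades.count("n") + grades.count("m")) / n   # r + n + m

print(f"strict={strict:.0%} loose={loose:.0%} permissive={permissive:.0%}")
# strict=30% loose=50% permissive=70%
```

Running the same grading pass after each PDCA iteration gives a cheap, repeatable number to track whether a tuning change actually helped.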
Search Performance Metrics
How to start?
• Query log analysis
• Help desk calls
• Diaries
• Focus groups
• Personas
• Team meetings
• Use cases
• User interviews
• User feedback
Capture the user requirements using traditional analysis techniques, or mine the existing data to analyze search performance and behavior.
Working like Google is not a requirement!
Where to start?
Monitor your top queries. Nail them!
The search feature jungle
Novices
Backwards-oriented behaviour
• Autosuggest – helps express specific terms and suggests other users’ queries
• Related searches – stimulate novices to explore related searches
• Avoid zero results – by using spelling correction, query expansion, and query reformulation
• Breadcrumbs – to navigate back to a previous query if one is unsuccessful
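As an illustration of the autosuggest idea, here is a minimal sketch that completes a typed prefix from past queries, ranked by how often other users issued them. The query log is invented sample data; a real engine would use an index, not a sorted list:

```python
import bisect
from collections import Counter

# Hypothetical query log: past searches, with repeats indicating popularity.
query_log = ["holiday request", "holiday policy", "help desk", "expense form",
             "expense form", "holiday policy", "holiday policy"]
freq = Counter(query_log)
sorted_queries = sorted(freq)  # sorted unique queries for prefix lookup

def autosuggest(prefix, k=3):
    """Return up to k past queries starting with prefix, most frequent first."""
    i = bisect.bisect_left(sorted_queries, prefix)
    matches = []
    while i < len(sorted_queries) and sorted_queries[i].startswith(prefix):
        matches.append(sorted_queries[i])
        i += 1
    return sorted(matches, key=lambda q: -freq[q])[:k]

print(autosuggest("hol"))  # ['holiday policy', 'holiday request']
```

Because suggestions come from real past queries, novices are nudged toward phrasings that are known to return results.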
Experts
Deep-dive & faster decisions
• Advanced syntax – willing to learn advanced commands to get more control in return
• Filtering & sorting – selecting ranges, excluding terms, filtering by format
• As-you-type results – enable users to skip search result screens
Close the gap
Designing search user interfaces that are easy to learn can help bridge the gap between novice and expert searchers, progressively training them to use the application.
Source: Spool, J. (2005). What makes design seem “intuitive”? User Interface Engineering. Retrieved June 8, 2012 from http://www.uie.com/articles/design_intuitive/.
Design for learnability
• Descriptive text in the search box
• Contextual popovers
• Guidance
• Full-screen overlays
Design with overviews & previews
• Maps
• Histograms
• Graphs
• Thumbnails
• Document previews
Source: blog.comperiosearch.com
Design with information scent
• Descriptive titles – reasonably long titles work better than shorter ones
• Indicate the number of results
• As-you-type query suggestions
• Hit highlighting
• Clear labeling (faster filtering)
• Zero Result strategy
When information scent is strong, users are confident that they’re headed in the right direction. When it’s weak, users may be uncertain of what to do next, or they may abandon their search altogether.
Analysis & Improvement
Pattern Analysis
Examples:
• Tonal patterns (swine flu <> h1n1)
• Synonym patterns (mail <> email)
• Time-based patterns (traffic @eod)
• Question patterns (categorization)
• Answer patterns (content types)
Approach:
• Find common usage patterns, trends, and outliers
• Start with queries and their relative frequency counts
• Eliminate search log “junk” (meaningless queries) as best you can to improve your analysis
• Try to understand what people are looking for based on the query cluster; this works best when done by multiple people
• Try to find what type of content users expect to find, to identify potential content types
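The first steps of pattern analysis can be sketched over a raw query log: normalize the queries, drop junk, and rank by relative frequency. The log below is invented sample data, and the junk filter (too short, no letters) is one simple assumption among many possible rules:

```python
from collections import Counter

# Hypothetical raw query log, including noisy entries.
raw_log = ["H1N1", "swine flu", "swine  flu", "mail", "email", "??", "a",
           "holiday policy", "swine flu"]

def normalize(q):
    """Lowercase and collapse whitespace so variants cluster together."""
    return " ".join(q.lower().split())

queries = [normalize(q) for q in raw_log]
# Junk filter: drop very short queries and ones with no letters at all.
junk_free = [q for q in queries if len(q) > 2 and any(c.isalpha() for c in q)]
counts = Counter(junk_free)
total = sum(counts.values())

for query, n in counts.most_common():
    print(f"{query:16s} {n:3d}  {n / total:.0%}")
```

Clusters that surface here (e.g. “swine flu” next to “h1n1”) are the raw material for the synonym and tonal patterns listed above.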
Failure Analysis
Queries that lead to zero results usually suggest that:
• You aren’t offering the content that your searchers want.
• You offer it, but the search engine isn’t finding it.
• A difference exists between how you and your searchers describe the same content.
Queries that fail to retrieve useful results, or that lead to immediate exits from your site, are other decent indicators of failure.
Diagnose problems and determine what to fix or improve for your site’s searchers.
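A first pass at failure analysis is just pulling the zero-result queries out of the log and ranking them by frequency, so the most painful gaps surface first. The log schema and data here are assumed for illustration:

```python
from collections import Counter

# Hypothetical search log: (query, result_count) pairs.
search_log = [
    ("vacation form", 0), ("holiday form", 12), ("vacation form", 0),
    ("payrol", 0), ("payroll", 8), ("vacation form", 0),
]

# Count how often each query came back empty.
zero_hits = Counter(q for q, hits in search_log if hits == 0)
for query, n in zero_hits.most_common():
    print(f"{n:3d}x  {query}")
```

In this sample, “vacation form” failing while “holiday form” succeeds is exactly the vocabulary mismatch the third bullet describes, and “payrol” points at a spelling-correction gap.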
Zero result strategy
Example: http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=nike%20sneakers%20ruskie
• Don’t be afraid to say you did not understand, to prevent thrashing (changing the query without resolving the problem).
• Focus on providing a way out. Make sure every control on the page does something productive to help resolve the no-results condition.
• Focus on the customer’s goal. Provide the most relevant recovery content first, while staying as close as possible to the customer’s original intent.
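A minimal sketch of the “say you did not understand, but offer a way out” idea, using the standard library’s difflib as a crude stand-in for real spelling correction. The known-query list is invented sample data:

```python
import difflib

# Hypothetical set of queries known to return results.
known_queries = ["holiday policy", "payroll", "expense form", "travel policy"]

def recover(query):
    """On zero results, suggest the closest known query or a productive way out."""
    suggestions = difflib.get_close_matches(query, known_queries, n=1, cutoff=0.6)
    if suggestions:
        return f'No results for "{query}". Did you mean "{suggestions[0]}"?'
    # No close match: admit it, and point at something productive.
    return f'No results for "{query}". Try browsing by department or ask the help desk.'

print(recover("payrol"))
```

Either branch keeps the user moving: a “did you mean” suggestion stays close to the original intent, and the fallback offers navigation instead of a dead end.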
Session Analysis
The best sessions to focus on include:
• Sessions that start and end with your most popular queries.
• Sessions that end in failure (to fix them).
• Sessions with specialized queries that are especially important to your business (such as product names).
If you have access to information about who searched what and when on your site, conducting session analysis will help you gain deeper insight into what searchers do and how their needs change over a short period of time.
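Before any of this analysis, the log has to be cut into sessions. A common (but arbitrary) heuristic is to start a new session whenever the gap between one user’s consecutive searches exceeds a timeout; the 30-minute value and the log data below are assumptions:

```python
TIMEOUT = 30 * 60  # seconds; a common but arbitrary session boundary

# Hypothetical log for one user: (timestamp_in_seconds, query).
log = [
    (0, "expense form"), (120, "expense claim form"), (300, "travel expenses"),
    (7200, "holiday policy"), (7260, "holiday request"),
]

sessions, current = [], []
last_ts = None
for ts, query in log:
    # A long silence means the earlier task ended; close the session.
    if last_ts is not None and ts - last_ts > TIMEOUT:
        sessions.append(current)
        current = []
    current.append(query)
    last_ts = ts
if current:
    sessions.append(current)

print(sessions)
# [['expense form', 'expense claim form', 'travel expenses'],
#  ['holiday policy', 'holiday request']]
```

The reformulations inside a session (“expense form” → “expense claim form”) are where you see a searcher’s need shifting, which is the insight the slide is after.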
Audience Analysis
• Consider performing pattern analysis, session analysis, failure analysis, and goal-based analysis on each segment.
• See what the segments share in common and how they differ.
• Insights can help define what labels to use for various audiences, or which types of content to show or prioritize by role.
Audience analysis will help you better understand how information needs and searching experiences differ between audience segments. Challenge the assumption that your users are all alike. Audience analysis can beef up your personas or boost your organization’s existing segmentation analysis.
Measure User Satisfaction
• Climate surveys
• User surveys
• User feedback forms
Thank you!
Bibliography
• Designing the Search Experience, Tony Russell-Rose & Tyler Tate, Elsevier
• Search Analytics for Your Site, Louis Rosenfeld, Rosenfeld Media
• Enterprise Search, Martin White, O’Reilly Media
• The Answer Machine, Sue E. Feldman, Morgan & Claypool Publishers
Recommended