Beyond search queries

Jano Suchal, searchd.co

Search as seen by developers

{ "query": { "query_string": { "query": "elasticsearch book" } }}

return response.hits.hits
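
For a runnable version of this developer view, here is a minimal Python sketch that posts the same query_string query to Elasticsearch's REST _search endpoint; the local node URL and the "books" index name are assumptions for illustration, not from the slides.

# Minimal sketch of the developer view: POST a query_string query to
# Elasticsearch's _search endpoint and return the raw hits, in engine order.
# The node address and the "books" index are assumptions for illustration.
import requests

def search(query_text, index="books", host="http://localhost:9200"):
    body = {"query": {"query_string": {"query": query_text}}}
    resp = requests.post(f"{host}/{index}/_search", json=body)
    resp.raise_for_status()
    return resp.json()["hits"]["hits"]   # the developer's job ends here

for hit in search("elasticsearch book"):
    print(hit["_score"], hit["_source"])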

Search as experienced by users

query: elasticsarch → Typo in query. No results.

query: elasticsearch → Too many hits. Not relevant.

query: elasticsearch book → Click! Success! Or is it?

Measuring search quality

Cpt. Obvious: “Hits, clicks and order do matter.”

Accurately interpreting clickthrough data as implicit feedback

Thorsten Joachims, Laura Granka, Bing Pan, Helene Hembrooke, and Geri Gay. Accurately interpreting clickthrough data as implicit feedback. In Proceedings of the 28th annual international ACM SIGIR conference on Research and development in information retrieval, SIGIR ’05, pages 154–161, New York, NY, USA, 2005. ACM.

Search quality metrics

● Mean Average Precision @ N
  ○ probability of the target result being in the top N items
● Mean Reciprocal Rank
  ○ 1 / rank of the target result
● Normalized Discounted Cumulative Gain
● Expected Reciprocal Rank
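
To make these concrete, here is a small Python sketch (my own illustration, not from the slides) of reciprocal rank and nDCG for a single query; the full metrics are just averages of these per-query values over many queries.

# Toy implementations of two of the metrics above, for a single query.
# MRR / mean nDCG are simply the averages of these values over many queries.
import math

def reciprocal_rank(ranking, relevant_ids):
    # 1 / rank of the first relevant ("target") result; 0 if it never shows up.
    for rank, doc_id in enumerate(ranking, start=1):
        if doc_id in relevant_ids:
            return 1.0 / rank
    return 0.0

def dcg(gains):
    # Discounted cumulative gain: gain at rank i is discounted by log2(i + 1).
    return sum(g / math.log2(i + 1) for i, g in enumerate(gains, start=1))

def ndcg(ranking, relevance, k=10):
    # relevance maps doc id -> graded relevance (0 = not relevant).
    gains = [relevance.get(doc_id, 0) for doc_id in ranking[:k]]
    ideal_dcg = dcg(sorted(relevance.values(), reverse=True)[:k])
    return dcg(gains) / ideal_dcg if ideal_dcg > 0 else 0.0

ranking = ["d7", "d2", "d5", "d1"]      # what the engine returned
relevance = {"d5": 3, "d1": 1}          # editorial or click-derived judgments
print(reciprocal_rank(ranking, set(relevance)))   # 1/3: target sits at rank 3
print(round(ndcg(ranking, relevance), 2))         # 0.53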

Search KPIs

● CTR trend

● # of queries w/o results or clicks

● # of searches per session

● Search engine latency
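
As one possible way to compute such KPIs, here is a Python sketch over a search log; the log records and their fields (session, n_results, clicked) are hypothetical.

# Toy KPI aggregation over a search log; the record schema is hypothetical.
from collections import Counter

log = [
    {"session": "s1", "query": "elasticsarch",        "n_results": 0,    "clicked": False},
    {"session": "s1", "query": "elasticsearch",       "n_results": 5000, "clicked": False},
    {"session": "s1", "query": "elasticsearch book",  "n_results": 12,   "clicked": True},
    {"session": "s2", "query": "elasticsearch guide", "n_results": 40,   "clicked": True},
]

searches = len(log)
ctr = sum(r["clicked"] for r in log) / searches        # CTR (clicks per search)
zero_results = sum(r["n_results"] == 0 for r in log)   # queries w/o results
no_clicks = sum(not r["clicked"] for r in log)         # queries w/o clicks
sessions = Counter(r["session"] for r in log)
searches_per_session = searches / len(sessions)        # searches per session

print(f"CTR {ctr:.0%}, zero-result {zero_results}, "
      f"no-click {no_clicks}, searches/session {searches_per_session:.1f}")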

Search quality optimization

Optimizing search engines using clickthrough data

Thorsten Joachims. Optimizing search engines using clickthrough data. In Proceedings of the eighth ACM SIGKDD international conference on Knowledge discovery and data mining, KDD ’02, pages 133–142, New York, NY, USA, 2002. ACM.

Query chains: learning to rank from implicit feedback

Filip Radlinski and Thorsten Joachims. Query chains: learning to rank from implicit feedback. In Proceedings of the eleventh ACM SIGKDD international conference on Knowledge discovery in data mining, KDD ’05, pages 239–248, New York, NY, USA, 2005. ACM.

Fighting Search Engine Amnesia: Reranking Repeated Results

Milad Shokouhi, Ryen W. White, Paul Bennett, and Filip Radlinski. Fighting search engine amnesia: reranking repeated results. In Proceedings of the 36th international ACM SIGIR conference on Research and development in information retrieval, SIGIR ’13, pages 273–282, New York, NY, USA, 2013. ACM.

In this paper, we observed that the same results are often shown to users multiple times during search sessions. We showed that there are a number of effects at play, which can be leveraged to improve information retrieval performance. In particular, previously skipped results are much less likely to be clicked, and previously clicked results may or may not be re-clicked depending on other factors of the session.
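
As a toy illustration of that observation (a deliberate simplification, not the paper's model), a session-aware reranker could simply demote results the user has already seen and skipped:

# Toy illustration of the observation above, not the paper's actual model:
# within a session, results the user already saw and skipped are demoted,
# since they are much less likely to be clicked again.
def rerank_with_session_history(results, skipped_ids):
    fresh = [r for r in results if r not in skipped_ids]
    previously_skipped = [r for r in results if r in skipped_ids]
    return fresh + previously_skipped

# d2 and d4 were shown earlier in the session and skipped.
print(rerank_with_session_history(["d1", "d2", "d3", "d4"], {"d2", "d4"}))
# -> ['d1', 'd3', 'd2', 'd4']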

searchd.co: Search Analytics

searchd.co dashboard

A/B testing

A/B testing lists

A/B testing

(figure: result list A vs. result list B)

A/B testing with interleaving

(figure: results from A and B interleaved into a single list)

Interleaving & scoring

Interleaving methods:
● Balanced
● Team Draft
● Probabilistic

Scoring:
● Binary preference
● Linear rank difference
● Inverse rank difference
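
To make one combination from the lists above concrete, here is a Python sketch of Team Draft interleaving scored with a binary click preference; the details are my own simplification of the commonly described algorithm, not searchd.co's implementation.

# Sketch of Team Draft interleaving: rankings A and B take turns drafting
# their best not-yet-picked result into one list, and every shown result is
# attributed to the team that drafted it. Binary-preference scoring then
# declares the winner of an impression by which team collected more clicks.
import random

def team_draft_interleave(ranking_a, ranking_b, length=10):
    interleaved, team_of = [], {}
    picks = {"A": 0, "B": 0}
    pools = {"A": list(ranking_a), "B": list(ranking_b)}
    while len(interleaved) < length:
        for pool in pools.values():              # drop already-drafted docs
            while pool and pool[0] in team_of:
                pool.pop(0)
        candidates = [t for t in ("A", "B") if pools[t]]
        if not candidates:
            break
        # The team with fewer picks drafts next; ties are broken randomly.
        team = min(candidates, key=lambda t: (picks[t], random.random()))
        doc = pools[team].pop(0)
        interleaved.append(doc)
        team_of[doc] = team
        picks[team] += 1
    return interleaved, team_of

def binary_preference(clicked_docs, team_of):
    clicks = {"A": 0, "B": 0}
    for doc in clicked_docs:
        if doc in team_of:
            clicks[team_of[doc]] += 1
    if clicks["A"] == clicks["B"]:
        return "tie"
    return "A" if clicks["A"] > clicks["B"] else "B"

mixed, team_of = team_draft_interleave(["a1", "x", "a3"], ["b1", "x", "b3"], length=4)
print(mixed)                                # e.g. ['a1', 'b1', 'x', 'a3']
print(binary_preference(["x"], team_of))    # credit goes to whichever team drafted "x"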

A/B testing with interleaving

● Pros
  ○ Lower risk of losing conversions
● Cons
  ○ Harder to interpret
  ○ Harder to implement

searchd.co: Search Analytics

● Identify and fix key search problems
● KPIs for site search
● Actionable tips for search tuning
● Safe A/B testing

● Easy setup

● In Beta, sending out invites

A bad search experience is a lost opportunity. Let's fix it.

searchd.co: Search Analytics

www.searchd.co
info@searchd.co
