Confessions of an Uber-Optimiser
5th Sep 2013 @OptimiseOrDie
Timeline
• 1998
• 1999 - 2004
• 2004 - 2008
• 2008 - 2012 : Belron Brands
SEO
PPC, UX, Analytics
A/B and Multivariate testing
Customer Satisfaction
Design
QA, Development
40+ websites, 34 countries, 19 languages, €1bn+ revenue
Performance
8 people
Ahh, how it hurt
"If you're not a part of the solution, there's good money to be made in prolonging the problem."
Out of my comfort zone…
Behind enemy lines…
Nice day at the office, dear?
Competition…
Traffic is harder!
SEO/PPC
Panguin tool…
Casino Psychology
If it isn't working, you're not doing it right
#1 : Your analytics is cattle trucked
#1 : Common problems (GA)
• Dual purpose goal page
  – One page used by two outcomes – and not split
• Cross domain tracking
  – Where you jump between sites, this borks the data
• Filters not correctly set up
  – Your office, agencies, developers are skewing data
• Code missing or double code
  – Causes visit splitting, double pageviews, skews bounce rate
• Campaign, Social, Email tracking etc.
  – External links you generate are not set up to record properly
• Errors not tracked (404, 5xx, Other)
  – You are unaware of error volumes, locations and impact
• Dual flow funnels
  – Flows join in the middle of a funnel or loop internally
• Event tracking skews bounce rate
  – If an event is set to be 'interactive', it can skew bounce rate (example)
#1 : Common problems (GA) – EXAMPLE

Landing | 1st interaction | Loss  | 2nd interaction | Loss  | 3rd interaction | Loss  | 4th interaction | Loss
55,900  | 527             | 99.1% | 66              | 87.5% | 55              | 16.7% | 33              | 40.0%
30,900  | 4,120           | 86.7% | 2,470           | 40.0% | 1,680           | 32.0% | 1,240           | 26.2%
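The loss percentages in the table follow directly from the raw counts: loss at each step is one minus the ratio of consecutive counts. A quick sketch that reproduces the two example rows (assuming the first row is the mis-tracked funnel):

```python
def step_losses(counts):
    """Percentage lost between each pair of consecutive funnel steps."""
    return [round(100 * (1 - b / a), 1) for a, b in zip(counts, counts[1:])]

# Landing count followed by 1st..4th interaction counts, from the example rows.
suspect = [55900, 527, 66, 55, 33]        # 99.1% first-step loss: smells like broken tracking
healthy = [30900, 4120, 2470, 1680, 1240]

print(step_losses(suspect))   # [99.1, 87.5, 16.7, 40.0]
print(step_losses(healthy))   # [86.7, 40.0, 32.0, 26.2]
```

A first-step loss over 99% is rarely real behaviour; it usually means two outcomes share one goal page or the tag is missing.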
#1 : Solutions
• Get a Health Check for your Analytics
  – Try @prwd, @danbarker, @peter_oneill or ask me!
• Invest continually in instrumentation
  – Aim for at least 5% of dev time to fix + improve
• Stop shrugging: plug your insight gaps
  – Change "I don't know" to "I'll find out"
• Look at event tracking (Google Analytics)
  – If set up correctly, you get wonderful insights
• Would you use paper instead of a till?
  – You wouldn't do it in retail so stop doing it online!
• How do you win F1 races?
  – With the wrong performance data, you won't
#2 : Your inputs are all wrong
Insight – Inputs #FAIL
Competitor copying, Guessing, Dice rolling, An article the CEO read, Competitor change, Panic, Ego, Opinion, Cherished notions, Marketing whims, Cosmic rays, Not "on brand" enough, IT inflexibility, Internal company needs, Some dumbass consultant, Shiny feature blindness, Knee-jerk reactions
#2 : Your inputs are all wrong
Insight – Inputs
Insight, Segmentation, Surveys, Sales and Call Centre, Session Replay, Social analytics, Customer contact, Eye tracking, Usability testing, Forms analytics, Search analytics, Voice of Customer, Market research, A/B and MVT testing, Big & unstructured data, Web analytics, Competitor evals, Customer services
#2 : Solutions
• Usability testing and User Centred design
  – If you're not doing this properly, you're hosed
• Champion UX+ – with added numbers
  – (Re)designing without inputs + numbers is guessing
• You need one team on this, not silos
  – Stop handing round the baby (I'll come back to this)
• Ego, Opinion, Cherished notions – fill gaps
  – Fill these vacuums with insights and data
• Champion the users
  – Someone needs to take their side!
• You need multiple tool inputs
  – Let me show you my core list
#2 : Core tools
• Properly set up analytics
  – Without this foundation, you're toast
• Session replay tools
  – Clicktale, Tealeaf, Sessioncam and more…
• Cheap / Crowdsourced usability testing
  – See the resource pack for more details
• Voice of Customer / Feedback tools
  – 4Q, Kampyle, Qualaroo, Usabilla and more…
• A/B and Multivariate testing
  – Optimizely, Google Content Experiments, VWO
• Email, Browser and Mobile testing
  – You don't know if it works unless you check
#3 : You're not testing (enough)
#3 : Common problems
• Let's take a quick poll
  – How many tests do you complete a month?
• Not enough resource
  – You MUST hire, invest and ringfence time and staff for CRO
• Testing has gone to sleep
  – Some vendors have a 'rescue' team for these accounts
• Vanity testing takes hold
  – Getting one test done a quarter? Still showing it a year later?
• You keep testing without buy-in at C-level
  – If nobody sees the flower, was it there?
• You haven't got a process – just a plugin
  – Insight, Brainstorm, Wireframe, Design, Build, QA test, Monitor, Analyse. Tools, Process, People, Time -> INVEST
• IT or release barriers slow down work
  – Circumvent with tagging tools
  – Develop ways around the innovation barrier
#4 : Not executing fast enough
• Silo Mentality means pass the product
  – No 'one team' approach means no 'one product'
• The process is badly designed
  – See the resource pack or ask me later!
• People mistake hypotheses for finals
  – Endless argument and tweaking means NO TESTING – let the test decide, please!
• No clarity: authority or decision making
  – You need a strong leader to get things decided
• Signoff takes far too long
  – Signoff by committee is a velocity killer – the CUSTOMER and the NUMBERS are the signoff
• You set your target too low
  – Aim for a high target and keep increasing it
#4 : Execution solutions
• Agile, One Team approach
  – Everyone works on the lifecycle, together
• Hire Polymaths
  – T-shaped or just multi-skilled, I hire them a lot
• Use Collaborative Tools, not meetings
  – See the resource pack
• Market the results
  – Market this stuff internally like a PR agency
  – Encourage betting in the office
• Smash down silos – a special mission
  – Involve the worst offenders in the hypothesis team
  – "Hold your friends close, and your enemies closer"
  – Work WITH the developers to find solutions
  – Ask Developers and IT for solutions, not apologies
#5 : Product cycles are too long
[Chart: conversion vs. months (0, 6, 12, 18)]
#5 : Solutions
• Give Priority Boarding for opportunities
  – The best seats reserved for metric shifters
• Release more often to close the gap
  – More testing resource helps, analytics 'hawk eye'
• Kaizen – continuous improvement
  – Others call it JFDI (just f***ing do it)
• Make changes AS WELL as tests, basically!
  – These small things add up
• RUSH Hair booking – over 100 changes
  – No functional changes at all – 37% improvement
• In between product lifecycles?
  – The added lift for 10 days' work, worth 360k
#5 : Make your own cycles
"Rather than try and improve one thing by 10% – which would be very, very difficult to do – we go and find 1,000 things and improve them all by a fraction of a per cent, which is totally do-able." – Chris Boardman
#6 – No Photo UX
24 Jan 2012
• Persuasion / Influence / Direction / Explanation
• Helps people process information and stories
• Vital to sell an 'experience'
• Helps people recognise and discriminate between things
• Supports Scanning Visitors
• Drives emotional response
short.cx/YrBczl
• Very powerful and under-estimated area
• I've done over 20M visitor tests with people images for a service industry – some tips:
• The person, pose, eye gaze, facial expressions and body language cause visceral emotional reactions and big changes in behaviour
• Eye gaze crucial – to engage you or to 'point'
Photo UX
• Negative body language is a turnoff
• Uniforms and branding a positive (ball cap)
• Hands are hard to handle – use a prop to help
• For Ecommerce – tip! Test bigger images!
• Autoglass and Belron always use real people
• In most countries (out of 33) with strong female and male images in test, the female image wins
• Smile and authenticity in these examples is absolutely vital
• So, I have a question for you
+13.9%
+5.9%
Terrible Stock Photos: headsethotties.com & awkwardstockphotos.com
Laughing at Salads: womenlaughingwithsalad.tumblr.com
BBC Fake Smile Test: bbc.in/5rtnv
SPAIN: +22% over control, 99% confidence
"It's not about what you think when you look at the design – it's about the reaction it causes in the mind of the viewer. Always design for that first."
#7 : Your tests are cattle trucked
• Many tests fail due to QA or browser bugs
  – Always do cross browser QA testing – see resources
• Don't rely on developers saying 'yes'
  – Use your analytics to define the list to test
• Cross instrument your analytics
  – You need this to check the test software works
• Store the variant(s) seen in analytics
  – Compare people who saw A/B/A vs. A/B/B
• Segment your data to find variances
  – Failed tests usually show differences for segments
• Watch the test and analytics CLOSELY
  – After you go live, religiously check both
  – Read this article: stanford.io/15UYov0
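Cross-instrumenting also lets you sanity-check the test software itself. One illustrative check (my addition, not named on the slide) is a sample-ratio-mismatch test: do the variant counts recorded in analytics actually match the intended 50/50 split? A stdlib-only sketch:

```python
import math

def srm_p_value(n_a, n_b, expected_a=0.5):
    """Two-sided p-value that the observed variant split matches the
    intended allocation (normal approximation to the binomial)."""
    n = n_a + n_b
    mean = n * expected_a
    sd = math.sqrt(n * expected_a * (1 - expected_a))
    z = (n_a - mean) / sd
    return math.erfc(abs(z) / math.sqrt(2))

# Counts pulled from analytics (hypothetical numbers):
print(srm_p_value(5050, 4950))  # ~0.32 -> split looks fine
print(srm_p_value(5300, 4700))  # vanishingly small -> broken bucketing, QA it
```

A tiny p-value here means the tool is mis-bucketing visitors (redirect failures, bot filtering, broken tags), and any 'uplift' it reports is suspect.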
#8 : Stats are confusing
• Many testers & marketing people struggle
  – How long will it take to run the test?
  – Is the test ready?
  – How long should I keep it running for?
  – It says it's ready after 3 days – is it?
  – Can we close it now – the numbers look great!
• A/B testing maths for dummies:
  – http://bit.ly/15UXLS4
• For more advanced testers:
  – Read this: http://bit.ly/1a4iJ1H
• I'm going to build a stats course
  – To explain all the common questions
  – To save me having to explain this crap all the time
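The maths behind "is it ready?" and "how long will it take?" boils down to a two-proportion z-test and a pre-test sample size estimate. A stdlib-only sketch using the standard normal approximation (1.96 and 0.84 are the usual constants for 95% confidence and 80% power; the traffic numbers are hypothetical):

```python
import math

def z_test_p(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a difference in conversion rates (normal approx.)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return math.erfc(abs(z) / math.sqrt(2))

def sample_size_per_arm(base_rate, rel_uplift):
    """Visitors needed per variant at 95% confidence / 80% power."""
    p1, p2 = base_rate, base_rate * (1 + rel_uplift)
    pbar = (p1 + p2) / 2
    num = (1.96 * math.sqrt(2 * pbar * (1 - pbar))
           + 0.84 * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# "The numbers look great after 3 days!" -- check how much data you really need:
print(sample_size_per_arm(0.03, 0.10))   # ~53k visitors per arm for a 10% relative lift
print(z_test_p(300, 10000, 360, 10000))  # p-value for an observed 3.0% vs 3.6%
```

Divide the per-arm number by your daily traffic per variant and you have a defensible answer to "how long should I keep it running?" before the test even starts.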
#9 : You're not segmenting
• Averages lie
  – What about new vs. returning visitors?
  – What about different keyword groups?
  – Landing pages? Routes? Attributes?
• Failed tests are just 'averaged out'
  – You must look at segment level data
  – You must integrate the analytics + A/B test software
• The downside?
  – You'll need more test data – to segment
• The upside?
  – Helps figure out why the test didn't perform
  – Finds value in failed or 'no difference' tests
  – Drives further testing focus
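"Averages lie" is worth seeing in numbers. Here is a made-up two-segment test (all counts hypothetical) where the variant wins in every segment yet loses in the pooled average, because the segments have very different sizes and base rates (Simpson's paradox):

```python
# (visitors, conversions) per arm, per segment -- illustrative numbers only.
segments = {
    "new":       {"a": (2000, 60),  "b": (6000, 210)},
    "returning": {"a": (6000, 600), "b": (2000, 230)},
}

def rate(visitors, conversions):
    return conversions / visitors

for name, arms in segments.items():
    print(f"{name}: A {rate(*arms['a']):.1%} vs B {rate(*arms['b']):.1%}")

# Pooled ("averaged out") rates hide the per-segment wins:
tot_a = [sum(x) for x in zip(*(s["a"] for s in segments.values()))]
tot_b = [sum(x) for x in zip(*(s["b"] for s in segments.values()))]
print(f"overall: A {rate(*tot_a):.1%} vs B {rate(*tot_b):.1%}")
```

B beats A in both segments (3.5% vs 3.0% and 11.5% vs 10.0%) but loses overall (5.5% vs 8.25%), purely because of the traffic mix. This is exactly why a 'no difference' headline result can still contain winning segments.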
#10 : You're unichannel optimising
• Not using call tracking
  – Look at Infinity Tracking (UK)
  – Get Google keyword level call volumes!
• You don't measure channel switchers
  – People who bail a funnel and call
  – People who use chat or other contact/sales
• You 'forget' mobile & tablet journeys
  – Walk the path from search -> ppc/seo -> site
  – Optimise for all your device mix & journeys
• You're responsive
  – Testing may now bleed across device platforms
  – Changing in one place may impact many others
  – QA, Device and Browser testing even more vital
SUMMARY : The best companies…
• Invest continually in Analytics instrumentation, tools & people
• Use an Agile, iterative, Cross-silo, One team project culture
• Prefer collaborative tools to having lots of meetings
• Prioritise development based on numbers and insight
• Practice real continuous product improvement, not SLED
• Source photos and copy that support persuasion and utility
• Have cross channel, cross device design, testing and QA
• Segment their data for valuable insights, every test or change
• Continually try to reduce cycle (iteration) time in their process
• Blend 'long' design, continuous improvement AND split tests
• Make optimisation the engine of change, not the slave of ego
• See the Maturity Model in the resource pack
So you want examples?
• Belron – Ed Colley
• Dell – Nazli Yuzak
• Shop Direct – Paul Postance (now with EE)
• Expedia – Oliver Paton
• Schuh – Stuart McMillan
• Soundcloud – Eleftherios Diakomichalis & Ole Bahlmann
• Gov.uk – Adam Bailin (now with the BBC)
Read the gov.uk principles: www.gov.uk/designprinciples
And my personal favourite of 2013 – Airbnb!
Don't panic!
• You're in the right place today!
• This work is rarely easy – it always involves doing LOTS of things, not just one test a quarter
• Invest in people, tools, analytics, techniques but most of all a process and strategy – cool tools are not enough
• Stop putting things in the next release (JFDI)
• This is not a bolt-on – it IS the process
• Don't be afraid to fail – you're learning
• Be Brave, Be Bold and most importantly…
• Never ever ever EVER give up
• Enjoy the wonderful lineup…
Is there a way to fix this then? Conversion Heroes!
@OptimiseOrDie
linkd.in/pvrg14
More reading. Download the slides! Questions…
RESOURCE PACK
• Maturity model
• Crowdsourced UX
• Collaborative tools
• Testing tools for CRO & QA
• Belron methodology example
• CRO and testing resources
1 - Maturity Model

Level 1 - Starter Level
• Culture: Ad Hoc, Local Heroes, Chaotic Good
• Process: Outline process, Small team, Low hanging fruit
• Testing focus: Guessing, A/B testing, Basic tools
• Analytics focus: Bounce rates, Big volume landing pages
• Insight methods: Analytics, Surveys, Contact Centre, Low budget usability
• Mission: Get buy-in

Level 2 - Early maturity
• Culture: Dedicated team
• Process: Volume opportunities
• Testing focus: + Multivariate, Session replay, No segments
• Analytics focus: + Funnel analysis, Low converting & high loss pages
• Insight methods: + Regular usability testing/research, Prototyping, Session replay, Onsite feedback
• Mission: Prove ROI

Level 3 - Serious testing
• Culture: Cross silo team, Systematic tests
• Process: Well developed
• Testing focus: + Funnel optimisation, Call tracking, Some segments, Micro testing
• Analytics focus: + Funnel fixes, Forms analytics, Channel switches
• Insight methods: + User Centered Design, Layered feedback, Mini product tests
• Mission: Scale the testing

Level 4 - Core business value
• Culture: Ninja Team
• Process: Streamlined
• Testing focus: + Cross channel testing, Integrated CRO and analytics, Segmentation
• Analytics focus: + Offline integration, Single channel picture
• Insight methods: + Customer sat scores tied to UX, Rapid iterative testing and design
• Mission: Mine value

Level 5 - You rock, awesomely
• Culture: Testing in the DNA, Company wide
• Process: Company wide
• Testing focus: + Spread tool use, Dynamic adaptive targeting, Machine learning, Realtime
• Analytics focus: Multichannel funnels, Cross channel synergy
• Insight methods: + All channel view of customer, Driving offline using online, All promotion driven by testing
• Mission: Continual improvement
2 - UX Crowd tools

Remote UX tools (P=Panel, S=Site recruited, B=Both):
• Usertesting (B) – www.usertesting.com
• Userlytics (B) – www.userlytics.com
• Userzoom (S) – www.userzoom.com
• Intuition HQ (S) – www.intuitionhq.com
• Mechanical turk (S) – www.mechanicalturk.com
• Loop11 (S) – www.loop11.com
• Open Hallway (S) – www.openhallway.com
• What Users Do (P) – www.whatusersdo.com
• Feedback army (P) – www.feedbackarmy.com
• User feel (P) – www.userfeel.com
• Ethnio (for recruiting) – www.ethnio.com

Feedback on Prototypes / Mockups:
• Pidoco – www.pidoco.com
• Verify from Zurb – www.verifyapp.com
• Five second test – www.fivesecondtest.com
• Conceptshare – www.conceptshare.com
• Usabilla – www.usabilla.com
3 - Collaborative Tools
Oh sh*t
3.1 - Join.me
3.2 - Pivotal Tracker
3.3 - Trello
3.4 - Basecamp
3.5 - Google Docs and Automation
• Lots of people don't know this
• Serious time is getting wasted on pulling and preparing data
• Use the Google API to roll your own reports straight into Big G
• Google Analytics + API + Google docs integration = A BETTER LIFE!
• Hack your way to having more productive weeks
• Learn how to do this to make completely custom reports
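The "roll your own reports" idea can be sketched as two small helpers: build the query the Google Analytics Core Reporting API expects, then flatten the response rows into CSV for a spreadsheet import. The profile ID, metric and dimension names below are illustrative; check them against the API reference for your account before relying on this.

```python
import csv
import io

def build_query(profile_id, start, end, metrics, dimensions):
    """Assemble the parameter dict for a Core Reporting API request."""
    return {
        "ids": f"ga:{profile_id}",
        "start-date": start,
        "end-date": end,
        "metrics": ",".join(metrics),
        "dimensions": ",".join(dimensions),
    }

def rows_to_csv(headers, rows):
    """Turn API response rows into CSV text ready for a spreadsheet import."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(headers)
    writer.writerows(rows)
    return buf.getvalue()

query = build_query("12345678", "2013-08-01", "2013-08-31",
                    ["ga:visits", "ga:goalCompletionsAll"],
                    ["ga:medium"])
print(query["metrics"])  # ga:visits,ga:goalCompletionsAll
```

Wire the query into an authenticated API client on a schedule and the weekly report builds itself instead of eating an afternoon.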
3.6 - Cloud Collaboration: LucidChart
3.7 - Cloud Collaboration: Webnotes
3.8 - Cloud Collaboration: Protonotes
3.9 - Cloud Collaboration: Conceptshare
4 - QA and Testing tools
Email testing: www.litmus.com, www.returnpath.com, www.lyris.com
Browser testing: www.crossbrowsertesting.com, www.cloudtesting.com, www.multibrowserviewer.com, www.saucelabs.com
Mobile devices: www.perfectomobile.com, www.deviceanywhere.com, www.mobilexweb.com/emulators, www.opendevicelab.com
5 - Methodologies - Lean UX
"The application of UX design methods into product development, tailored to fit Build-Measure-Learn cycles."
Positive:
• Lightweight and very fast methods
• Realtime or rapid improvements
• Documentation light, value high
• Low on wastage and frippery
• Fast time to market, then optimise
• Allows you to pivot into new areas
Negative:
• Often needs user test feedback to steer the development, as data alone is not enough
• Bosses distrust stuff where the outcome isn't known
5 - Agile UX / UCD / Collaborative Design
"An integration of User Experience Design and Agile* Software Development Methodologies" (*Sometimes)
Cycle: Concept -> Research -> Wireframe -> Prototype -> Test -> Analyse
Positive:
• User centric
• Goals met substantially
• Rapid time to market (especially when using Agile iterations)
Negative:
• Without quant data, user goals can drive the show – missing the business sweet spot
• Some people find it hard to integrate with siloed teams
• Doesn't work with waterfall IMHO
5 - Lean Conversion Optimisation
"A blend of User Experience Design, Agile PM, Rapid Lean UX Build-Measure-Learn cycles, triangulated data sources, triage and prioritisation."
Positive:
• A blend of several techniques
• Multiple sources of Qual and Quant data aids triangulation
• CRO analytics focus drives unearned value inside all products
Negative:
• Needs a one team approach with a strong PM who is a Polymath (Commercial, Analytics, UX, Technical)
• Only works if your teams can take the pace – you might be surprised though!
5 - Lean CRO
Cycle: Inspection -> Immersion -> Identify -> Triage & Triangulate -> Outcome Streams -> Instrument -> Measure -> Learn
5 - Triage and Triangulation
• Starts with the analytics data
• Then UX and user journey walkthrough from SERPS -> key paths
• Then back to analytics data for a whole range of reports:
  – Segmented reporting, Traffic sources, Device viewport and browser, Platform (tablet, mobile, desktop) and many more
• We use other tools or insight sources to help form hypotheses
• We triangulate with other data where possible
• We estimate the potential uplift of fixing/improving something as well as the difficulty (time/resource/complexity/risk)
• A simple quadrant shows the value clusters
• We then WORK the highest and easiest scores by…
• Turning every opportunity spotted into an OUTCOME
"This is where the smarts of CRO are – in identifying the easiest stuff to test or fix that will drive the largest uplift."
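The uplift-vs-difficulty quadrant can be approximated as a simple ranking: estimated uplift divided by effort, highest first. A sketch with entirely hypothetical opportunities and scores:

```python
# (name, estimated uplift score 1-10, difficulty score 1-10) -- illustrative only.
opportunities = [
    ("Fix broken basket tracking", 8, 2),
    ("Redesign whole checkout",    9, 9),
    ("Rewrite landing page copy",  5, 3),
    ("New personalisation engine", 7, 10),
]

def prioritise(items):
    """Rank opportunities: big, easy wins first (the quadrant's value cluster)."""
    return sorted(items, key=lambda item: item[1] / item[2], reverse=True)

for name, uplift, effort in prioritise(opportunities):
    print(f"{uplift / effort:4.1f}  {name}")
```

The big redesign scores a 9 for uplift but sinks to mid-table once effort is counted, while the cheap tracking fix floats to the top, which is the whole point of triage.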
5 - The Bucket Methodology
"Helps you to stream actions from the insights and prioritisation work. Forces an action for every issue, a counter for every opportunity being lost."

Test – If there is an obvious opportunity to shift behaviour, expose insight or increase conversion, this bucket is where you place stuff for testing. If you have traffic and leakage, this is the bucket for that issue.

Instrument – If an issue is placed in this bucket, it means we need to beef up the analytics reporting. This can involve fixing, adding or improving tag or event handling in the analytics configuration. We instrument both structurally and for insight in the pain points we've found.

Hypothesise – This is where we've found a page, widget or process that's just not working well but we don't see a clear single solution. Since we need to really shift the behaviour at this crux point, we'll brainstorm hypotheses. Driven by evidence and data, we'll create test plans to find the answers to the questions and change the conversion or KPI figure in the desired direction.

Just Do It – JFDI is a bucket for issues where a fix is easy to identify or the change is a no-brainer. Items marked with this flag can either be deployed in a batch or as part of a controlled test. Stuff in here requires low effort or is a micro-opportunity to increase conversion, and should just be fixed.

Investigate – You need to do some testing with particular devices or need more information to triangulate a problem you spotted. If an item is in this bucket, you need to ask questions or do further digging.
5 - Belron example – Funnel replacement
Final prototype -> Usability issues left -> Final changes -> Release build
-> Legal review kickoff -> Cust services review kickoff -> Marketing review -> Test Plan
-> Signoff (Legal, Mktng, CCC) -> Instrument analytics -> Instrument Contact Centre -> Offline tagging
-> QA testing -> End-End testing -> Launch 90/10% -> Monitor -> Launch 80/20% -> Monitor < 1 week
-> Launch 50/50% -> Go live 100% -> Analytics review -> Washup and actions
-> New hypotheses -> New test design -> Rinse and Repeat!
6 - CRO and Testing resources
• 101 Landing page tips: slidesha.re/8OnBRh
• 544 Optimisation tips: bit.ly/8mkWOB
• 108 Optimisation tips: bit.ly/3Z6GrP
• 32 CRO tips: bit.ly/4BZjcW
• 57 CRO books: bit.ly/dDjDRJ
• CRO article list: bit.ly/nEUgui
• Smashing Mag article: bit.ly/8X2fLk
END SLIDES
Feel free to steal, re-use, appropriate or otherwise lift stuff from this deck.
If it was useful to you – email me or tweet me and tell me why – I'd be DELIGHTED to hear!
Regards,
Craig.