A.B.T. (Always Be Testing)
How UX testing can help you meet and exceed program goals
1. Why Test UX?
You might be thinking…
• It takes time and expertise to collect & analyze information…
• Testing might expose a need for more extensive work…
• This all impacts the cost and schedule of my project…
…So why should I do it?
You’re losing money if you aren’t testing.
Testing helps identify problems before time and money are spent creating something that doesn’t work well.
Testing will help you meet and exceed your goals.
• User Experience
• User Engagement
• User Satisfaction
• Conversions / Meeting Goals
Testing will validate or invalidate your assumptions.
Challenging Your Expectations
• Did you see what you expected to see?
• How are users defying your expectations?
• What is working well?
• What isn’t working well?
Testing will help you think like your users.
Thinking Like Your Users
• See your content from the viewpoint of someone who isn’t you (your organization, your board of directors, etc.)
• Learn things you didn’t even know you should be thinking about
2. The Testing Blueprint
When? The time is always…
Testing Blueprint: When to Test – Now
Start testing right now, focusing on these areas:
1. High traffic pages
2. Conversion pages
3. Email campaigns
Testing Blueprint: When to Test – Optimal Times
Optimal testing times:
1. Before a redesign
2. Before EOY campaigns
3. Before large fundraising pushes (awareness or action months)
4. Don’t test during your largest and most critical campaigns; test before.
Testing Toolkit
Testing Toolkit: Tools for the Job
• Native Platform Analytics
• Google Analytics
• Crazy Egg
• Google Experiments
• Optimizely
• Movable Ink (Email Campaigns)
Testing Toolkit: Native Platform Analytics
• Purpose: Data Collection
• Cost: Free/Included with Platform
• Learning Curve: Easy
• Examples:
  – Jetpack for WordPress
  – Luminate Online Reporting
Jetpack for WordPress
Testing Toolkit: Google Analytics
• Purpose: Data Collection
• Cost: Free
• Learning Curve: Medium
• Setup: Include a JavaScript snippet on every page of your website (or any page where you need tracking)
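Google’s snippet has changed over the years; as a sketch, its current gtag.js form looks roughly like this (the measurement ID "G-XXXXXXXXXX" is a placeholder — use your own property’s ID, and load Google’s gtag/js script on the page as well):

```javascript
// Google Analytics (gtag.js) bootstrap, placed inside a <script> tag on every
// page you want tracked. "G-XXXXXXXXXX" is a placeholder measurement ID.
window.dataLayer = window.dataLayer || [];
function gtag() { dataLayer.push(arguments); }
gtag('js', new Date());
gtag('config', 'G-XXXXXXXXXX');
```

The same snippet on every page is what lets GA stitch individual page views into funnels, like the donation funnel discussed later.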
Google Analytics Dashboard
Testing Toolkit: Crazy Egg
• Purpose: Data Collection
• Cost: Low
• Learning Curve: Easy
• Use: View user click, hover, and scroll tendencies with heat mapping to evaluate the effectiveness of your UX design.
Crazy Egg Heat Map
Testing Toolkit: Google Experiments
• Purpose: Variation Testing (A/B/C)
• Cost: Free
• Learning Curve: Hard
• Setup: Use JavaScript to change elements; Google Analytics/Experiments handles testing parameters and reporting
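As a sketch of that setup: the testing tool assigns each visitor a variation index, and your JavaScript swaps page elements accordingly. The headlines and element ID below are hypothetical examples, and `cxApi` refers to the (since-retired) Content Experiments browser API:

```javascript
// Map a variation index (0 = original) to the creative to show.
// Falls back to the original for unknown indexes.
function pickHeadline(variation) {
  var headlines = [
    'Donate Today',           // 0: original
    'Give Now, Save Lives',   // 1: variant A
    'Join the Fight'          // 2: variant B
  ];
  return headlines[variation] || headlines[0];
}

// In the browser, the tool supplies the index and we swap the element:
// var variation = cxApi.chooseVariation();   // loaded via Google's cx/api.js
// document.querySelector('#donate-cta').textContent = pickHeadline(variation);
```

Keeping the variation-to-creative mapping in one pure function makes it easy to add a third variant without touching the wiring code.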
Google Analytics Dashboard
Testing Toolkit: Optimizely
• Purpose:
  – Variation Testing (A/B/C)
  – Data Collection
• Cost: Enterprise
• Learning Curve: Easy
• Setup: Insert JavaScript snippet and run experiments using Optimizely’s platform
Optimizely Experiment
Testing Toolkit: Movable Ink
• Purpose: Email Engagement
• Cost: Low (based on email sends)
• Learning Curve: Easy
• Use: Engage users with dynamic images based on location, constituent information, or date/time.
Movable Ink Dashboard
Realistic Goals
To-Do List:
1. Wait for Tonight
2. Try and Take Over the World
Brainstorm Goals: The “How Can We?” Question
What are your goals? Start broad with a brainstorming session and generalized language:
“How can we…”
– increase donations?
– get more email sign-ups?
– increase engagement with email campaigns?
Refine Goals: The “What Are We Measuring?” Question
Next, include a clear measurable you can monitor with data:
“How can we increase donation…”
– click-throughs?
– conversions?
– average amounts?
Define Goals: The “Would This Work?” Question
Now refined, it’s time to define your goals with an action:
“Would this work?”
– A donation lightbox on high-traffic pages to increase donation click-throughs.
– A geo-targeted banner on the homepage to increase event registrations.
– Stronger language in our email campaigns to increase our CTR.
Measure Twice… Test Once
Measuring Success: The First Measure
Before you test, answer these questions:
1. What is your current performance on these pages/campaigns?
2. What is the industry and internal organization average performance?
3. How long will we run the test to ensure statistically significant data?
4. What is a realistic expectation for success?
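For question 3, a standard way to check significance once results come in is a two-proportion z-test. This is a minimal sketch (the conversion counts in the usage note are hypothetical):

```javascript
// Two-proportion z-test: compares conversion rate A (convA out of totalA
// visitors) against rate B. Returns the z-score of the difference.
function zScore(convA, totalA, convB, totalB) {
  var pA = convA / totalA;
  var pB = convB / totalB;
  var pooled = (convA + convB) / (totalA + totalB);
  var se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se;
}

// |z| > 1.96 corresponds to 95% confidence for a two-sided test.
function isSignificant(convA, totalA, convB, totalB) {
  return Math.abs(zScore(convA, totalA, convB, totalB)) > 1.96;
}
```

For example, 100 conversions from 1,000 visitors vs. 150 from 1,000 is significant at 95% confidence, while 100 vs. 105 from the same traffic is not — which is exactly why a test must run its full timeline before you declare a winner.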
Measuring Success: The Test
While you test, here are your new mantras:
1. Be patient! Stick to the timeline and see it to the end.
2. Monitor consistently:
   1. Set up reports daily/weekly
   2. Perform real-life tests periodically
   3. Tweak if results show a dramatic winner halfway through
Measuring Success: The Second Measure
After your test, debrief and decide:
1. Were the results statistically significant?
2. Did the test run long enough to ensure a large enough sample size for scaling?
3. Were our expectations met?
4. What did we learn?
3. Testing for a Website
Google Analytics Content Experiment: Donation Intercept Test
GOAL:
Determine the best CTA option to increase traffic to the donation form
TOOL: Google Analytics (GA)
• No additional investment required
• GA already set up to track the donation funnel
Control Lightbox Test
Donation Intercept Test: Measures/Outcome
Measures
• Control:
  – Donation button in header
  – Donation array in footer
• Lightbox Version A: lightbox appears after 15 seconds
• Lightbox Version B: lightbox appears after 75% scroll

Outcome
• Version A resulted in the highest traffic generation
• The control and Version A delivered similar numbers of completed donations
• Version B proved not to be a valid option to drive either traffic or donations
If you have 1,000 donations per year, imagine what a 10% increase would do to your bottom line!
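The two lightbox triggers tested above take only a few lines of JavaScript to sketch; `scrollPercent` and `showLightbox` are hypothetical names, not code from the actual test:

```javascript
// How far down the page has the user scrolled, as a percentage?
function scrollPercent(scrollTop, viewportHeight, documentHeight) {
  var scrollable = documentHeight - viewportHeight;
  if (scrollable <= 0) return 100; // whole page fits in the viewport
  return (scrollTop / scrollable) * 100;
}

// Browser wiring for Version B (75% scroll):
// window.addEventListener('scroll', function () {
//   var pct = scrollPercent(window.scrollY, window.innerHeight,
//                           document.documentElement.scrollHeight);
//   if (pct >= 75) showLightbox();
// });
//
// Version A (15-second timer) is simply: setTimeout(showLightbox, 15000);
```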
Optimizely: Donation Array Test
GOAL:
• Determine the most valuable offering for the donation array
• Determine if there is any variance based on entry point to the donation form

TOOL: Optimizely
• GA didn’t track the donation funnel
• Easy to replicate and host multiple forms
• Minimal disruption for their internal web & IT teams
• Ability to segment audiences and select traffic sources to determine differences
Donation Array Test
Three Versions:

CONTROL | VERSION A (Lower) | VERSION B (Higher)
$50     | $35               | $55
$100    | $50               | $100
$250    | $100              | $250
$500    | $250              | $500
$1,000  | $500              | $1,500
Donation Array Test: Measures/Outcome
Test Group | Home Page Click-through Rate | High Traffic Internal Page Click-through Rate | Donation Form Completion Rate | Avg. Donation
Variation 1 - Control | 2.1% | 0.014% | 10.4% | $99
Variation 2 - Lower   | 1.9% | 0.020% | 12.8% | $106
Variation 3 - Higher  | 2.2% | 0.022% | 13.4% | $144
2015 Industry Avg. is 7%
Imagine if all of their donations from the first half of the year had come in at the $144 average gift instead of $99.
Optimizely: Geo-Targeted Event Recruitment Banners
GOAL:
• Determine if we could convert Home Page visitors to Walk participants

TOOL: Optimizely
• Easy to modify pages without diving deep into code
• Minimal disruption for their internal web & IT teams
• Easy to serve specific creative to different audiences (geo-targeted)
• Ability to set up timed experiments based on event date
Geo-Targeted Event Registration Banners
Desktop Mobile
Geo-Targeted Recruitment Banners: Measures/Outcome
• Not a traditional test – no versions to test against
• 6 pilot locations
• Each banner was coded to display to a unique visitor within a specified geographic catchment area
• Banners shut off automatically the day before the event
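The catchment-plus-cutoff rule can be sketched with a great-circle distance check. All names, coordinates, and dates here are hypothetical; in the actual test the geo-targeting was handled inside Optimizely:

```javascript
// Great-circle distance between two lat/lng points in km (haversine formula).
function distanceKm(lat1, lng1, lat2, lng2) {
  var toRad = function (d) { return d * Math.PI / 180; };
  var R = 6371; // mean Earth radius, km
  var dLat = toRad(lat2 - lat1);
  var dLng = toRad(lng2 - lng1);
  var a = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
          Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) *
          Math.sin(dLng / 2) * Math.sin(dLng / 2);
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Show the banner only inside the catchment radius and before the cutoff date.
function shouldShowBanner(visitor, event, now) {
  var inCatchment =
    distanceKm(visitor.lat, visitor.lng, event.lat, event.lng) <= event.radiusKm;
  return inCatchment && now < event.shutOffDate;
}
```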
Measures
• Banners resulted in a 2.5–6% engagement rate, depending on location (vs. 0.08% for a traditional paid display ad)
• No change in bounce rates or drop-off rates
• Registrations ranged from 0.5% to 1%
Outcome
What would a 1% increase in event registrations mean for your event?
Crazy Egg: Heat Mapping to Inform UX Design
GOAL:
• Re-design or refresh?
• Prove that the investment in the current website design is still sound

TOOL: Crazy Egg Heat Mapping
• Site already used best practices for successful event sites
• No obvious challenge areas
• Observe user behavior to identify opportunities and gaps
• Connects to GA account via simple code inserted into the page
• Low monthly fee – a very affordable testing option
Heat Mapping to Inform UX Design
The Site Heatmap Overlay
Are site visitors doing what you want them to do?

Long Scroll Pages
Where are they spending their time?
Are they getting to the most valuable information?
What information are they missing?
Heat-mapping: Measures/Outcome
• Understanding how your site visitors are interacting with your content to inform possible updates or future testing
• Provides navigation cues to streamline content to ensure focus is on your main CTAs: register, donate, log in
• Identifies content placement priorities for long-scrolling pages
Measures
• Proved that the overall structure of the site is good
• Prioritize content on long-scrolling pages
• Optimal placement for social sharing tools
Outcome
Age-old question: website re-design or website refresh? Let your site visitors tell you what they need.
4. Testing for an Email Campaign
Why email testing?
Email is the channel where you consistently reach out and communicate directly with your most dedicated supporters.
Really learn about your constituents and include those lessons in your ongoing campaign strategies…
• What inspires them?
• What drives them?
• What might fall flat?
A/B Testing
What is it?
It’s split testing! Comparing two versions of a web page or email to see which one performs better. You do this by splitting your audience and serving up a different variant to find which creative results in a better conversion rate.
Most email marketing tools have built-in, user-friendly A/B testing features.
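Under the hood, such tools typically assign each recipient to a stable bucket so the same constituent always sees the same variant. A minimal sketch (the hash below is purely illustrative, not what any particular tool uses):

```javascript
// Deterministic 50/50 split: hash the email address to a stable bucket so a
// given recipient always receives the same creative across sends.
function bucketFor(email) {
  var hash = 0;
  for (var i = 0; i < email.length; i++) {
    hash = (hash * 31 + email.charCodeAt(i)) >>> 0; // unsigned 32-bit
  }
  return hash % 2 === 0 ? 'A' : 'B';
}
```

Because the bucket depends only on the address, re-sending mid-campaign never flips anyone between the general-language and hard-hitting versions, which keeps the comparison clean.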
A/B Testing: General vs Hard-Hitting Language
GOAL:
• Prove a theory to internal review teams: hard-hitting facts and statements would result in better performance
• Statistically significant results
• Lift KPIs, including amount raised

TOOL: Built-in A/B Testing Functionality
• Ability to easily split segments and serve up A/B creative
• Manageable resource lift to create the test build from the control email build
• Easy reporting to identify winners early in the campaign for later sends and, ultimately, determine statistically significant results
General Language Version Hard-Hitting Language Version
A/B Testing: General vs Hard-Hitting Language
Measures
• Developed a Marquee Testing strategy over the 3 largest campaigns in the first half of the year, yielding 6 months of data on ONE test
• Followed engagement KPIs: Open Rate, CTR, Conversion Rate, # of Gifts, Amount Raised, Avg. Gift

Outcome
• Hard-hitting copy had a direct impact not only on the decision to give but also on the amount the donor decided to give
  – Amount raised and average gift KPIs for the launch were up 100% or more compared to control
• Confidence in a global brand refresh that heads in a bold direction
• All teams on the same page, focused on new learnings
If hard-hitting language was right for your organization and raised a $30K campaign by 30–40%, you could make an additional $9,000 to $12,000!
Movable Ink
What is it?
Contextual email marketing software that allows you to dynamically change the content of an email upon the moment of open to tailor messaging in real-time.
Movable Ink
What you can do with Movable Ink:
• Image Personalization
• Geo-Targeted Personalization
• Weather Personalization
• Device-Targeted Personalization
• Data Automation, Image Swaps
• Live Polling
• Live Social Feeds
• Real-Time Optimization
Example: Humane Society CTA Button Testing
Example: Humane Society Average Gift Suggestion Test
ABT: Always Be Testing
• There is no wrong time to start testing
• Testing is well worth the investment
• Start by testing your assumptions
• Test in increments, one element at a time
WELCOME TO THE FUTURE.
Coming soon…
Questions?