Travel Insurance that cares | UX Case Study on World Nomads
The Design Process
Step 1. The Brief:
Provide validated solutions to combat purchase path drop-offs on the World Nomads mobile site.
What do we know?
We had a lot of questions for the client around the brief. We walked ourselves through the mobile purchase path to get familiar with it, then met with the client to discuss the perceived problem.
In terms of the mobile drop-off, our client, the UX lead, provided us with data showing where the two main drop-off points were. He then showed us a comparison between the desktop and mobile drop-off rates. The mobile site clearly had more drop-offs, so we needed to understand why.
This became the catalyst for our research. We talked with the client about the importance of the mobile site vs the desktop site. He made us aware that a significant number of travel insurance purchases were made on the day of travel, from a mobile device, so we needed to solve the drop-off issue as a matter of urgency.
We were given a fair amount of user research to work with:
-Recorded mobile user testing sessions
-The company ‘mindset’, which they use instead of a persona
We asked a lot of questions about this data. I love to validate through contextual relevance, so I wanted to know: when the tests were conducted, in which countries (considering it’s a global brand), what tasks the users were asked to complete, how often the business conducts research and what trends they see. We also asked about their largest competitors & their points of difference.
Step 2. What is the problem & how do we validate it?
Data pointed to two areas on the mobile site, so our problem area was relatively well defined.
What the client wanted:
To increase the mobile conversion rates for their travel insurance product.
What that meant to us: identify the pain points on the drop-off pages and find out how to remove these hurdles.
We decided that analysing the provided data should be the first step in this task.
-I started to transcribe the user recordings, then we affinity mapped them, looking for patterns in pain points. We could tell pretty quickly that the amount of scrolling was a definite issue, along with the site navigation/IA. In the recordings, many of the users couldn’t complete the given task; becoming frustrated, they left the test incomplete. We could see that the users’ points of contention in the screen recordings were backed up by the heat maps/scroll maps and drop-off data.
- We decided that a combined technique of interviewing & user testing of the existing site would be the best way to gather the qualitative information we needed: what was driving people away from the site, and why?
The research side of UX is my favourite part of the puzzle. I enjoy getting to the root of the problem, understanding why people are getting frustrated and where improvements can be made to enhance their experience and make it a memorable one.
Step 3. Clarify with Qualitative data
Why were users dropping off at these two points?
To gain understanding, we researched in 3 forms:
1. Onsite Survey: We wanted to understand from the actual users why they were struggling. We knew that the mobile site had high traffic, so we wanted to capture some insights there. We spoke with the stakeholder and were granted permission to embed an onsite mobile survey. We added two questions at different points of the purchase path:
-The 1st question was at the start of the ‘customise your policy’ page, which was drop-off point #1. It asked customers ‘Are there any questions you are not finding the answers to?’ The aim was to clarify why they were discouraged from proceeding further.
-The 2nd question was for those who made it to the end and successfully bought travel insurance. At this point, we asked them ‘What nearly stopped you from purchasing travel insurance from us today?’ This question was there to identify the most dominant point of issue in the purchase path.
This survey was unfortunately only live for one day. If we could have kept it live a little longer, the results would have been much more telling. We did get about 10 responses to each question, so that was a good start.
2. Heuristic analysis: We conducted a heuristic analysis of World Nomads and their top 3 competitors. We compared their purchase paths at the same drop-off stages to see how their IA was structured.
3. 1:1 Interview and User Testing with x5 users
We combined interviews with user testing of the live site, setting tasks for participants that would take them through the drop-off zones.
My group was happy for me to conduct the 1:1 interview/testing as that’s a strong skill of mine. I really enjoy understanding how the user thinks, what they expect & what delights them.
We interviewed x5 people over the course of the day. They were not World Nomads users but were active travellers who had recently purchased travel insurance.
In the sessions we gave them two tasks to complete:
The first task was the same as the one in the user-testing recordings we had transcribed earlier: to book insurance for themselves for a trip, including cover for their iPhone & camera and a hiking adventure.
The second task was to book insurance for themselves and a friend, cover their phone, camera and add cover for skiing and ice climbing too.
Step 4. Data Analysis
After the interviews were conducted and survey results came back, we were able to affinity map the qualitative data with a deeper understanding of the ‘why’.
We decided to map our research according to each page of the purchase path and divided feedback into positive and negative information.
There was a lot more negative than positive feedback, so we had a lot to work with, especially on the drop-off pages, which was great.
We gained insight into exactly what the users were expecting the purchase path to deliver and what experience they were faced with in reality.
The interviews & user testing provided a lot of clarity around the expectations of buying travel insurance, and where World Nomads was falling short. Patterns have evolved over time across insurance sites: users are quite clear about what they want to achieve and how long a task should take. Understanding that most mobile purchases came from last-minute coverage on the day of travel, we deduced that users need clear, concise, directional information about their travel insurance. We had to restructure and increase clarity to enable users to navigate the site successfully.
We created a problem statement:
‘As a World Nomads user, I need to be able to buy travel insurance quickly and easily on the go from my mobile, and know exactly what I’m covered for.’
From the data clusters, we created the insights, objectives and story cards to aid our discovery stage.
Insight 1: Lack of clarity around the website’s purpose.
Story card: As a travel insurance seeker, I need to know I’m in the right place so that I can buy travel insurance.
User validation: Through user testing/interviews, we understood that the value proposition of the site was not abundantly clear on first landing on the mobile home page. Users have to scroll down a fair way before they can see the ‘get a quote’ feature, or go to the hamburger menu if they don’t make it that far in the scroll.
Our objective: Improve the clarity of the homepage value proposition.
Insight 2: Unintuitive ‘adding traveller’ feature that doesn’t conform to standard formatting.
Story card: As a travel insurance seeker, I need to add extra travellers to the policy simply and intuitively, so that I can quickly proceed to the next step.
User validation: During user testing, we asked users to book travel insurance for themselves and a friend. Almost all of them were confused about how to do this. The ‘adding traveller’ section was not clear, most users misunderstood it completely.
“I can’t find anything…I didn’t see that you could enter the ages of each” — User quote
Many of the users expected a ‘traveller 1’ & ‘traveller 2’ field.
Our objective: Redesign the ‘Extra Traveller’ feature in line with expectations.
Insight 3: Navigation and layout need improvement; the quote process is confusing to users, and content needs more clarity.
Story card: As a travel insurance seeker, I need to understand the differences between policies so that I can understand what I am covered for.
User validation: From the affinity mapping, we determined:
1. There was a hefty amount of scrolling each user had to undertake to compare the different policies.
2. When users hit the ‘find out more’ option on the different policies, they expected to find out about the plans, but the dropdown actually showed the options to donate to charity.
3. The policy inclusions were hard to identify.
4. Adding activities was tricky for users.
Our objective: Improve the clarity and simplicity of the quote page.
Insight 4: Layout and IA needed revisiting; lack of clarity leads to confusion.
Story card: As a travel insurance seeker, I need to customise my policy quickly and easily so that I know what extras I am, and am not covered for.
User validation: From user testing, we determined:
1. The difference between the policies was not clear. Users asked for a comparison table and wanted to be prompted to switch plans to get better coverage.
2. Users wanted more transparency around items not covered under the plan.
3. The progress bar needed restructuring to show which steps were left to complete.
4. There was no confirmation that coverage for items or activities had been successfully added, and no summary page showing added items/activities. Users expected this so they could make sure they hadn’t forgotten anything.
Our objective: Improve speed, clarity and simplicity of the page.
Step 5. Design — IA — Ideation — Wireframing
We used the x4 story cards and our heuristic competitor analysis to guide our wireframe design and iterate simultaneously.
We walked through the purchase path step by step, iterating & designing as we went. This enabled us to create a simpler navigation process, removing hurdles and introducing desired features.
Each wireframe was created taking into account where we needed to simplify and clarify. For example, Insight 2 concerned the additional traveller: the way the traveller section was set out on the quote page made it unclear whether you could get a quote for x2 people. To address this, we combined the competitor analysis with the users’ expectations. Our iteration resulted in two boxes labelled ‘Traveller 1’ & ‘Traveller 2’.
Once we were set on the wireframing & IA (having addressed each of the story cards), we were ready to crack into prototyping.
Step 6. Prototyping/User testing
We created a high fidelity prototype of the entire purchase path in Adobe XD which we tested with x5 users.
We created a script and asked users for their expectations and feelings throughout the iterated purchase path journey.
The prototype was created to test out the specific details that we had identified as pain points to the user.
A good example of this: in the first round of user testing we asked the users to obtain a quote for x2 people. The majority of users couldn’t complete this task and got frustrated.
The current site looks like this:
Even though the traveller box can have multiple travellers added, the majority of the users didn’t see this on the mobile site and bypassed it completely, which frustrated them even more. We noted this and added it into our iteration for the prototype below:
You can see above that we made a simple change and created two traveller fields. This completely eliminated the confusion. Each user added the second traveller without hesitation in our prototype testing round.
Another more significant change was creating a comparison table for the two different insurance products.
On the current mobile site, you could only view the plans one after the other, which meant a lot of scrolling up and down. There was also no option to compare the plans. We decided to introduce a comparison feature to combat this (the far right page):
Feedback from testers actually noted that they didn’t want the individual plan pages at all; the comparison page alone gave them all the information they required.
We also amended the way users were prompted to add activities after adding high-value items.
In the current site, they are two vastly different processes which result in confusion. See below:
We wanted to ensure that the mobile site repeated a consistent pattern for users. You can see that there are icons to tap to add your high-value items, which are easy for users to see. To add activities, however, there is only a text bar, which some users skimmed right over.
We decided to separate these tasks by moving them onto two different pages.
We kept the high-value items as they were, then added a page where you can cover your activities in the same way.
No users struggled to add activities when we tested our prototype.
Lastly, we made a change to ensure that activities not covered by the policy were made obvious by red shading.
In the live mobile site, activities not covered appear with a small red cross that can be easily missed. We decided to make it more transparent by shading the whole box in red. During our testing, it was obvious right away to the users if they were not covered for an activity.
Step 7. The handover
As this was a waterfall project, we were able to hand over the validated solutions from user testing and deliver on their brief.
We could have easily spent another two weeks on this project. I would have loved to refine and test the prototype again.
We met with the client and walked him through our process from end to end & delivered an executive summary. He was pleased with what we had accomplished as a team in such a small amount of time.
Step 8. Next Steps/Outcomes
We handed the company all of our insights and let them know about the quick wins that don’t require a lot of time, money and hassle to implement, like the comparison page.
A new comparison page has now been added to the live website (see image below). We hope that a mobile rollout is on the way soon!
During this project, I felt the research insights were very strong. We were able to give validated and tested insights to the company. Stakeholder communication was great throughout the process; they were engaged and happy to see where the research pointed. We delivered on the brief, giving clear and proven iterations to help combat their mobile drop-off issue.