
Usability Reviews

We interact with products and services every day, and sometimes we encounter experiences we just have to write about. We use this section to share our usability reviews with you.

21 things we like about the new M&S website

21 Feb

We noticed yesterday that the new Marks & Spencer website redesign went live. Here’s a quick summary of the changes we felt were most interesting. More research would be needed for us to give a thorough UX opinion, but our first thoughts are that it’s a positive redesign.

M&S website redesign

In this article, we highlight the 21 UX improvements made to the new Marks & Spencer website and why we like them.
Read more

 

 


Damian Rees

About Damian Rees

Damian has worked as a usability and user experience consultant for over 13 years. He has held senior roles at companies including the BBC and National Air Traffic Services, where he researched and designed for users in a variety of contexts including web applications, voice recognition, and air traffic control interfaces. Follow Damian on Twitter @damianrees

How to turn social media engagement into ecommerce sales

18 Feb

We often receive requests from companies in need of UX advice, and thought this one would be relevant to share with our readers. The founder of Isabel and I, a relatively new Australian clothing brand, got in touch asking how to convert a growing social media following into more paying visitors to the ecommerce site. Her ultimate goal was to ‘convert likes into sales’.

We conducted a user experience audit of the site, placing ourselves in the shoes of users and travelling through the site attempting to complete common customer goals. During the audit we identified several areas where we felt improvements to the user journey would increase conversions. After we sent our report to Isabel and I, Aundrea, the owner, told us she found our findings really useful. We wanted to share them on our blog in the hope that you will find them useful too.

 

Isabel and I

Aundrea Quote
Read more

 

 


Jenny Coford

About Jenny Coford

Jenny is a Graphic Design graduate with a passion for communication, who joined our team in November 2013. She has been busy immersing herself in the world of UX, creating Axure prototypes and researching the latest digital trends to share with you. She has a real obsession with organisation, so can usually be found writing the next office to-do list. Follow Jenny on Twitter @jennycoford

14 leading insurance providers’ quote processes and what you can learn from them

24 Jan

According to Insurance Business Online, 69% of insurance policies* were acquired online last year, yet in usability testing we see many users struggling with quotation processes. Insurance companies still neglect the quotation and application process on their websites and present poorly designed, unintuitive and confusing forms to prospective customers, which in turn could cost them conversions.

14 leading insurance providers' quote process and what you can learn from them

We reviewed the usability of these sites to see which one offers the best life quotation experience

 
In this post, we analyse the life insurance quote process of 14 leading insurance companies and comparison websites to see which sites offer the best overall usability and user experience. We also explain what makes a good quote form and how insurance companies can implement changes on their websites.
Read more

 

 


Oliver Gitsham

About Oliver Gitsham

Oli is a Senior User Experience Designer with 8 years' experience of researching and designing digital user interfaces. Oli has just become a dad for the first time, so we're expecting some rants about buggy usability any time now. Follow Oli on Twitter @olivergitsham

10 ways ASOS convert visitors to buyers

21 Nov

Online fashion retailer ASOS has been in the news once again for another year of soaring success, at a time when other retailers are reporting another quarter of ‘difficult trading conditions’. Rob Bready, Product and Trading Director at ASOS, attributed the success to the user experience of the website (and the free delivery)…

“It is very simple – the site is beautiful, easy to use and delivery is free”

10 ways Asos convert visitors to buyers
So this got us thinking. What is it exactly that makes Asos.com so easy to use? What can other online retailers learn from them that could help improve the user experience of their website and lead to better online conversions?

We put together the top 10 things that online retailers can learn from Asos.
Read more

 

 


Oliver Gitsham

About Oliver Gitsham

Oli is a Senior User Experience Designer with 8 years' experience of researching and designing digital user interfaces. Oli has just become a dad for the first time, so we're expecting some rants about buggy usability any time now. Follow Oli on Twitter @olivergitsham

Insight from usability testing – how to get more online donations

6 Sep

When usability testing charity websites, we see the same user need going unfulfilled time and again. Before deciding to donate, volunteer, or fundraise for you, users need to know where the money goes.

They’ve heard about charities eating up all the money themselves and only a small amount getting to the people who need it. They want to know your charity isn’t like that. But you also know that users aren’t going to read your AGM notes and won’t invest time reading about your financial structure. So what do you do? In this article we’ll show you some of the sites doing it well and give you some inspiration on how to fix your site to generate more online donations.

Read more

 

 


Damian Rees

About Damian Rees

Damian has worked as a usability and user experience consultant for over 13 years. He has held senior roles at companies including the BBC and National Air Traffic Services, where he researched and designed for users in a variety of contexts including web applications, voice recognition, and air traffic control interfaces. Follow Damian on Twitter @damianrees

Crazy Egg – Our Review

21 Feb

What is it?

Crazy Egg is an online tool that monitors individual pages from your website, giving you a breakdown of where different visitors have clicked and on which part of the screen. There is also some more basic analytics data available, presented through different visual displays, on which pages have been viewed most frequently and where your visitors have come from.

All you have to do is insert a small bit of code into the HTML of the page you want it to monitor, then sit back and relax. The site does it all for you from then on, providing live results as it tracks every click visitors make on the chosen pages.
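Crazy Egg supplies its own account-specific snippet, so the exact code you paste will differ; the sketch below is only a generic illustration of what "a small bit of code" typically looks like, a script tag added dynamically so a third-party tracker loads without blocking the page. The URL shown is a placeholder, not Crazy Egg's real endpoint.

```typescript
// Illustrative only: a generic async loader for a third-party tracking script.
// The URL below is a placeholder; Crazy Egg provides its own account-specific snippet.
function loadTrackingScript(src: string): void {
  const script = document.createElement("script");
  script.src = src;
  script.async = true; // don't block page rendering while the tracker loads
  document.head.appendChild(script);
}

// Hypothetical usage: call once on each page you want monitored.
loadTrackingScript("https://example.com/tracking/1234.js");
```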
 

What are its advantages?

Simple steps to set it up. Probably the best advantage of Crazy Egg is that it’s easy to set up on any website, providing you have access to the site’s code. In just three steps you can set up Crazy Egg to monitor as many pages of your site as you wish (with a free account you can monitor up to 4 pages; to monitor more you have to upgrade to a paid account, but prices start as low as $9 for 10 pages, which is peanuts really).

Five different and interactive ways of viewing the results. Crazy Egg provides an array of different ways of viewing the results; each has its own uses and will cater to different people’s preferences for viewing information.

 

The four different visual views that Crazy Egg provides (from left to right): heatmap, scrollmap, confetti, overlay

 
All the different views certainly look eye-catching and interesting, but the one we think is the most informative and useful is the confetti visual, which displays all the clicks colour-coded and lets you filter which clicks, from which external sources, you want displayed on the page. The other views certainly have their uses too, and collectively provide supporting evidence for theories drawn from the overall information gathered. For example, the scrollmap in conjunction with the heatmap or confetti map can give you a good gauge of where the most focused part of the page is and whether users are clicking on your call-to-action buttons within those areas.
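To make the idea concrete, here is a minimal sketch, entirely our own illustration rather than anything from Crazy Egg's API, of how raw click records (an x/y position plus a traffic source) can be filtered by source and binned into a coarse grid. This is essentially the kind of aggregation a confetti or heatmap view performs for you.

```typescript
// A minimal, hypothetical illustration of how click data can be aggregated.
// Crazy Egg does this for you; the types and field names here are our own.
interface Click {
  x: number;       // horizontal position of the click, in pixels
  y: number;       // vertical position of the click, in pixels
  source: string;  // e.g. "google", "twitter", "direct"
}

// Count clicks per grid cell, optionally keeping only clicks from given sources.
function binClicks(
  clicks: Click[],
  cellSize: number,
  sources?: string[]
): Map<string, number> {
  const bins = new Map<string, number>();
  for (const click of clicks) {
    if (sources && !sources.includes(click.source)) continue; // confetti-style filter
    const key = `${Math.floor(click.x / cellSize)},${Math.floor(click.y / cellSize)}`;
    bins.set(key, (bins.get(key) ?? 0) + 1);
  }
  return bins; // higher counts correspond to the "hotter" areas of the page
}
```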

 

List view

The list view is the last of the five views and displays a table of the items clicked on within the page

 
Exportable reports of your results. Crazy Egg allows you to export a report of your results, which is good for sharing and bringing to meetings. The different views also allow for an interesting display of information (not the usual pie or bar charts). A slight limitation, however, is that it merely exports a ‘screenshot’ of your snapshot in whichever visual you are viewing when you click export report.

Crazy Egg data can aid design decisions. The information Crazy Egg provides about your pages can prove helpful in discovering which elements of your website need to change, whether moved or altered aesthetically to enhance their use. As I mentioned earlier, the accuracy and specificity of Crazy Egg is really where its advantage over other analytics tools such as Google Analytics lies. By letting you see exactly where your users are trying to click, or more importantly where they aren’t, it allows you to draw informed conclusions about where the potential areas of improvement are within the site. Finding out what those improvements should be, however, requires more research than Crazy Egg can provide.
 

What are its limitations?

Despite many claims, it is not a usability tool. Don’t let the reviews fool you. It can help you spot a problem, such as a button that does not get used, but it doesn’t give you any information as to why people aren’t clicking on that button. You can make assumptions from the data Crazy Egg gives you, but each assumption you make could create yet another problem. Still, once a problem is spotted, these questions can easily be resolved by investing in a proper usability test of the site.

The confetti visitor box doesn’t move off the snapshot. When viewing the confetti results, there is a little black box that lists the different visitor information in one area. However, the box sits within the captured ‘screenshot’ of your page, so at no point can you see the whole page on its own. I realise you can minimise the box, but even then I found it distracting. This might just be a pernickety personal irritation, but I found it quite frustrating when trying to see the spread of clicks across the page.

 

The information on each type of visitor is displayed in the box above, which cannot leave the screen

 
You can only compare two ‘snapshots’ via the heatmap view. Crazy Egg allows you to compare page results within the site, though you can only compare them within the heatmap view. This can become tiresome, especially if you want to compare specific clicks between pages.
 

The comparison feature on Crazy Egg only allows for a comparison of the heatmap view

 

When we recommend using it

We would recommend using it if you want a very basic insight into what your users are doing when they come onto your site, i.e. which links are being used and which seem to be ignored. This can help in making design decisions to improve your users’ interactions with your site. However, each of these decisions still involves a large element of guesswork, which is why usability testing is best used in conjunction with site analytics, to ensure that you know exactly what needs to be changed, why, and what it needs to be changed to.

 

 


Samantha Harvey

About Samantha Harvey

Sam recently graduated in Visual Communication. She joined our team in April 2011 and has been conducting user research and making sure our user interfaces follow good design principles. She's keen to point out our poor selection of fonts... er, I mean typography (sorry Sam). Follow Samantha on Twitter @samharvey_ux

Loop 11 – Our Review

6 Jan

What is it?

Loop11 is an online, remote usability testing tool. In simple terms, Loop11 allows you to create an online survey that includes tasks for users to complete on a website of your choice. You can mix questions with tasks in any order you wish, and for each task you can set a goal for the user to achieve, e.g. find a size 10, blue women’s jumper for under £30. You then set the URL that they will start on at the beginning of the task. There is a demo of how it works on the website which takes you through what your users will see when they complete a test.
 
Tasks & Questions

A step-by-step process of setting up your test

 
Below are two examples showing how the test looks to the user when they are completing a task (Figure 1) or answering a question (Figure 2). On each task the objective is always displayed at the top along with the progression buttons. When asking a question, Loop11 gives you various question types which allow users to answer in an assortment of different ways; the example I have shown is a normal text field answer/comment box.
 
Task Example

Figure 1. An example of what a task would look like to a user

 
Example Question

Figure 2. You can insert questions before or after the tasks; above is an example of a text field question

 

What are its advantages?

This is the closest tool we have found so far to an actual usability test. Loop11 is one of the only sites we have found that really lets you create something similar to the usability tests we regularly carry out. You have control over the design of a proper test script, which is the main reason we have found it so useful.

You don’t have to be there with the user (in the same room). Loop11 tests are un-moderated, so all you have to do is design the test, set it live and spread the word by sending a link to your selected participants. Users can then complete the tasks in their own home at their own pace. Its main advantage over face-to-face usability tests is that it allows you to test as many users as you want.

Loop11 isn’t free, but it is cheaper than moderated usability testing. You also don’t have to spend much money on incentives, as users participate when and where they want (we’ve written more about this on UXbooth). We still included a small incentive as a thank-you for the time users spent completing the test, and as we made sure the test itself wasn’t long, it worked out rather well for everyone.

You can use Loop11 on any device that browses the internet. We haven’t tested this, but we would assume it holds in most cases: Loop11 runs in the browser and is fully online, so it shouldn’t matter too much which device you use.

View ready-made reports on participants completing your test. Once the test is live and participants start completing it, you can view reports showing how many people succeeded in completing tasks, how long it took them, and how many pages they went through to get there. This information comes in several exportable formats (Excel XML, PDF, CSV, and Excel XML with participants), so you have access to the original data to do with as you please. The PDF option exports a report version with graphically presented results for the overall test and for each task; however, we found the Excel document of raw data the most useful, as it allowed us to work with the data to produce a report with the information we required. We could then brand it and use it within our projects (a small sketch of this kind of summarising follows the sample report below).

 
Sample Report

A sample version of the PDF report that Loop11 exports after a test is finished
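Once you have the raw export, headline numbers are easy to compute yourself. The sketch below is a hypothetical illustration only; the field names (participant, task, success, seconds) are our own invention and not Loop11's actual export schema. It shows how per-task success rates and average completion times might be tallied from a set of exported rows.

```typescript
// Hypothetical example: summarising raw test results exported from the tool.
// The field names below are illustrative; Loop11's real export will differ.
interface TaskResult {
  participant: string;
  task: string;
  success: boolean;
  seconds: number;
}

function summarise(
  results: TaskResult[]
): Map<string, { successRate: number; avgSeconds: number }> {
  // Group the rows by task.
  const byTask = new Map<string, TaskResult[]>();
  for (const r of results) {
    const group = byTask.get(r.task) ?? [];
    group.push(r);
    byTask.set(r.task, group);
  }
  // Compute a success rate and an average completion time per task.
  const summary = new Map<string, { successRate: number; avgSeconds: number }>();
  for (const [task, group] of byTask) {
    const successes = group.filter((r) => r.success).length;
    const totalSeconds = group.reduce((sum, r) => sum + r.seconds, 0);
    summary.set(task, {
      successRate: successes / group.length,
      avgSeconds: totalSeconds / group.length,
    });
  }
  return summary;
}
```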

 

What are its limitations?

Once a test is published there is no going back. You can’t edit the test after publishing it; you can only delete it altogether, so you had better get it right the first time! We would recommend thinking carefully about the order of your tasks before adding them to the test list. Re-check and double-check as many times as possible in preview mode before putting it live.

Judging the success or failure rate can be tricky. The tool tests live websites, so for each task you set the URL you want users to start from. To track success and failure, you then need to add the URLs of the pages you consider to count as a success. This can be problematic if your task can be completed in a variety of different ways: if you don’t anticipate them all, you could record a failure even when the user succeeds (the sketch below illustrates the idea).
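Purely as an illustration of why this matters, and not of how Loop11 implements its matching, here is a small sketch in which a reached URL is normalised (query string and trailing slash removed) and compared against a list of acceptable 'success' pages. Any valid completion path you forget to list is counted as a failure; the site and task in the example are hypothetical.

```typescript
// Illustrative only: why every valid completion page must be anticipated.
// Loop11 configures success URLs in its own interface; this just shows the idea.
function normalise(url: string): string {
  // Strip the query string and any trailing slash so minor variations still match.
  return url.split("?")[0].replace(/\/+$/, "").toLowerCase();
}

function isSuccess(reachedUrl: string, successUrls: string[]): boolean {
  const reached = normalise(reachedUrl);
  return successUrls.some((u) => normalise(u) === reached);
}

// Hypothetical task: "find a blue women's jumper for under £30".
const successUrls = [
  "https://example.com/womens/jumpers/blue-jumper",
  // If the same product is also reachable at /sale/blue-jumper and we forget to
  // list it here, users who complete the task that way are recorded as failures.
];

console.log(isSuccess("https://example.com/womens/jumpers/blue-jumper?size=10", successUrls)); // true
console.log(isSuccess("https://example.com/sale/blue-jumper", successUrls));                   // false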

Tasks need to be carefully designed. The design of each task becomes critical when doing an un-moderated test, especially with this tool, as it needs to be much less fluid than a typical face-to-face usability test, where we would normally give users the freedom to complete the task as they naturally would. With Loop11 you are forced to be more quantitative in your approach in order to get a clearly defined pass/fail criterion, so we found the tool pushed us to design tasks that were much more basic and definite. For example, in a face-to-face test we might ask users to describe a problem they typically face and then show us how they would solve it using the site. With Loop11 we would design the task to answer a very specific question that we know they can find an answer to on a specific page.

Slow loading times on some websites. Sometimes the website you want to test will perform slowly, which could affect how likely people are to complete the test. We also noticed it crashed on a few sites. We recommend checking how well a site performs in preview mode before you commit to purchase. Loop11 does provide a piece of JavaScript that you can insert into the body of the site to improve the speed; unfortunately we couldn’t do this on the site we were using, which is a drawback if you are testing a site without access to the backend.

You might get more people participating than you bargained for. One quite costly drawback, especially for those of us who incentivise participants, is that although you can set a limit on the number of participants, the test only closes after that number have completed it. It cannot stop people who had already started the test while the 20th (in this example) person was finishing. You may therefore end up paying out for slightly more people than you originally wanted (we set a limit of 20 but had to pay 24 participants).

You don’t know who really completed the test properly. This is probably the most obvious limitation of un-moderated testing. You would like to think that anyone who sat down to do the test would want to do it properly, but you will get those who rush through the tasks and questions just to gain the incentive at the end. Normally you can look at the results and spot the people who probably didn’t complete it properly: those who took the shortest time to complete their tasks and didn’t write any comments (a simple screening rule is sketched below). The question you then have to ask yourself is whether to discount them from the results.
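The screening rule we describe is easy to automate once you have the raw data. The sketch below is our own hypothetical illustration (the field names are invented, not Loop11's): it flags participants whose total time falls below a chosen threshold and who left every comment blank.

```typescript
// Hypothetical screening of rushed participants; field names are our own.
interface ParticipantRecord {
  id: string;
  totalSeconds: number;  // time taken across all tasks
  comments: string[];    // free-text answers given during the test
}

// Flag anyone who finished suspiciously fast AND left no written comments.
function flagRushers(records: ParticipantRecord[], minSeconds: number): ParticipantRecord[] {
  return records.filter(
    (r) => r.totalSeconds < minSeconds && r.comments.every((c) => c.trim() === "")
  );
}

// Example: flag participants who finished in under 3 minutes with no comments.
// Whether to discount them is still a judgement call, not something to automate blindly.
const suspects = flagRushers(
  [
    { id: "p1", totalSeconds: 95, comments: ["", ""] },
    { id: "p2", totalSeconds: 640, comments: ["Couldn't find the size filter", ""] },
  ],
  180
);
console.log(suspects.map((r) => r.id)); // ["p1"]
```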
 

When we recommend using it

If you’re looking for a quick, high-level view of how a website is doing, this would be a good solution for you. It also helps if you are having problems testing users who are geographically spread or too time-poor to meet one to one: with Loop11, participants can take part anywhere they have internet access, whenever they want. Because of its nature, Loop11 is a useful tool for benchmarking a site and finding out where users are running into an issue, before carrying out some face-to-face usability testing to find out exactly where the problem lies. The scenario where we are most likely to use it is to monitor the usability of sites we have already worked on. However, as the tool has strict success/fail metrics, it is only really suited to carefully designed tasks which have a clear answer.
 
See anything I’ve missed? Just let me know!

 

 


Samantha Harvey

About Samantha Harvey

Sam recently graduated in Visual Communication. She joined our team in April 2011 and has been conducting user research and making sure our user interfaces follow good design principles. She's keen to point out our poor selection of fonts... er, I mean typography (sorry Sam). Follow Samantha on Twitter @samharvey_ux

TheClickTest.com – Our review

11 Nov

What is it?

The Click Test is a simple, quick test that allows you to upload an image (which can contain one or two versions of a design for viewers to choose from) and ask the viewer a question which they answer by clicking on a certain part of the design. It’s part of a suite of tools offered by UsabilityHub (we’ll review the other tools later).

A good example of how theclicktest.com works would be to upload a screenshot of a homepage you are designing and ask the viewer to click where they think they would find the contact details. You don’t have to use it just for web designs, though; in the example below it is being used to determine which picture the majority thinks best represents the description.
 

Example of theclicktest.com

A screenshot of the tool in action where users are invited to click the option they feel works best

 

What are the advantages of using theclicktest.com?

Theclicktest.com is simple and can be very useful when trying to answer basic questions about interface design. If you need a quick response to confirm a suspicion, the click test takes only a minute to load and is free if you collect ‘Karma points’ by completing a few of the other tests that members have uploaded, e.g. if you complete 2 random Click Tests on the site you earn 2 Karma points, which in turn allow you to have 2 people take your test. For those of us in a hurry, or who need a large number of people to fill out the tests, there is also the option to purchase Karma points or simply email a URL to existing contacts.

Results start being collected instantly, and due to the popularity of the site and the random order in which tests are given to viewers, you are bound to start collecting results within the hour. Every time we have used the Click Test we have only had to wait a day to collect the number of results we wanted.

The results come in three different forms: a plasma map, a heat map, and a click map.
 

Snapshot of the result maps

A snapshot of the results for choosing a colour scheme (plasma, heat, and click map from left to right)

 
We found the click map the most useful as it clearly shows how many people clicked on which graph (in the above example). Nonetheless, the other maps are visually pleasing, and with a higher volume of participants and a different sort of test they could prove just as useful!

What are the limitations of using theclicktest.com?

More complex questions might not be as reliable: the tests are meant to be quick, and determining how much time someone spent reading the question and thinking about the answer is near impossible. This brings me to the main problem with the click test, and in fact with all small, short online tests. You cannot tell whether the answers you are given are genuine, as people may be participating in the test just to earn Karma points, not really taking notice of the questions and clicking anywhere to complete it.

Another issue with allowing anyone to take part in your test is that, other than the people you recruit yourself, you do not know whose opinion you are taking note of, which may be detrimental to your design.

In addition, the tool only records one click: the last click a viewer makes on the page before finishing. This means questions are limited, as they need to have only one answer for the test to work. We found this out the hard way by uploading a test that asked viewers to click on the ‘two best colour schemes’; when we checked the results, we discovered that only one click had been recorded for each participant (a sketch of this behaviour follows below).
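To show why that happens, here is a minimal sketch of how a one-click test commonly behaves; this is our own illustration of the general pattern, not UsabilityHub's actual implementation. Each new click simply overwrites the previously stored one, so only the final click is ever submitted.

```typescript
// Illustrative only: a recorder that keeps just the last click before submission.
// This mirrors the behaviour we observed, not UsabilityHub's real code.
interface ClickPoint {
  x: number;
  y: number;
}

let lastClick: ClickPoint | null = null;

document.addEventListener("click", (event: MouseEvent) => {
  // Each click overwrites the previous one; earlier clicks are lost.
  lastClick = { x: event.clientX, y: event.clientY };
});

function submitResult(): ClickPoint | null {
  // Only one point is ever sent, even if the viewer clicked several times.
  return lastClick;
}
```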

When should I use it?

So far we’ve found it to be a useful tool for getting opinions on basic design elements such as colour scheme choice and chart style. But we haven’t relied upon this data alone. We’ve used it to guide decisions, and have then followed up with face-to-face user feedback before making a final call. By all means use it to test web designs and guide any other design decisions that fit into a one-click-answer question, but don’t let it be the only result you use to make a final decision.

If you can think of anything else you would like us to cover in future reviews please get in touch!

 

 


Samantha Harvey

About Samantha Harvey

Sam recently graduated in Visual Communication. She joined our team in April 2011 and has been conducting user research and making sure our user interfaces follow good design principles. She's keen to point out our poor selection of fonts... er, I mean typography (sorry Sam). Follow Samantha on Twitter @samharvey_ux