Tag Archives: user research

5 ways to increase online donations – Free eBook

15 Jul

5 ways to increase online donations

We have put together a free eBook to help non-profit organisations increase online donations. Having spent over 100 hours watching users take part in usability tests on non-profit websites, we have a wealth of useful insights to share. This guide outlines small changes that can be made to charity websites to encourage users to donate, and what is likely to make them abandon their donation.

In this post we provide a snippet from the book. You can read Chapter 1 below and then simply Pay with a Tweet to download the full eBook.
Read more

 

 


Damian Rees

About Damian Rees

Damian has worked as a usability and user experience consultant for over 13 years. He has worked in senior roles within companies like the BBC and National Air Traffic Services, where he has researched and designed for users in a variety of different contexts including web applications, voice recognition, and air traffic control interfaces. Follow Damian on Twitter @damianrees

Lunchtime learning – The Architecture of Understanding

6 Jun

The team at Experience Solutions are partial to a lunchtime learning session to share ideas and discuss the latest trends in UX. We thought we’d share some of the videos we’ve been watching to inspire you during your lunch hour. For the next few weeks, every Friday, we’ll share a lunchtime learning video with you.

This week’s video is a great presentation from Peter Morville (@morville), lasting around 30 minutes.

Peter Morville quote

Read more

 

 


Damian Rees

A/B Testing and why making assumptions in UX is a dangerous game

18 Dec

Between projects we like to set up A/B testing experiments on our website to test theories and ideas that come up during client work. Our latest experiment focused on how to encourage users to download a free guide, testing a large call-to-action box against a smaller one. Rather than worry too much about getting the design and styling right first, we wanted to find out which version would generate the most conversions and then take an ongoing, agile approach to eventually reach the right solution.
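For readers who haven’t wired up a split test before, here is a rough, illustrative sketch of the mechanics involved: assign each visitor to a variant, remember the choice, and record a conversion against it. The variant names, storage key, element id and logging endpoint below are placeholders, not our actual setup.

```typescript
// Rough client-side split-test sketch (illustrative only).
// The variant names, storage key, "#download-guide" id and "/log" endpoint
// are placeholders rather than the tooling we actually used.

type Variant = "large-cta" | "small-cta";

const STORAGE_KEY = "cta-experiment-variant";

// Assign a visitor to a variant once and keep it, so they always see the same design.
function getVariant(): Variant {
  const stored = localStorage.getItem(STORAGE_KEY);
  if (stored === "large-cta" || stored === "small-cta") {
    return stored;
  }
  const assigned: Variant = Math.random() < 0.5 ? "large-cta" : "small-cta";
  localStorage.setItem(STORAGE_KEY, assigned);
  return assigned;
}

// Record a conversion against the variant the visitor saw.
function logConversion(variant: Variant): void {
  navigator.sendBeacon("/log", JSON.stringify({ event: "guide-download", variant }));
}

const variant = getVariant();
document.body.classList.add(variant); // CSS shows either the large or the small CTA box
document.querySelector("#download-guide")?.addEventListener("click", () => {
  logConversion(variant);
});
```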

Which test do you think would convert better?

In true ‘which test won’ style, take a look at these two pages and take a stab at which one you think converted better (the winner will be revealed later in the article):
A/B Testing and why making assumptions in UX is a dangerous game
Read more

 

 


Damian Rees

Stop waiting for the perfect time to run usability tests

30 Jul

No one is perfect, that's why pencils have erasers

 

“Perfectionism is the enemy of progress.” This is a quote I’ve found to be very true throughout my life. Whenever I hear or read it, I remember to stop trying to do everything perfectly and to focus on getting things done once they are good enough. Once I start focusing on good enough, my to-do list shortens, I meet deadlines, and I generally feel like I’m getting somewhere again. So when I see clients falling into the same perfectionist trap, I try to encourage them to focus on progress rather than perfection.
Read more

 

 


Damian Rees

5 ways charities can quickly improve online donations

14 Nov

Donation appeals from charity websites

A recently published report provides an overview of charitable giving in the UK. It says that charity donations are down by 20% on the previous year, and suggests that both the overall number of people donating and the amount they give have fallen.

So what can charities do to boost their donations in difficult times? From hours of research with users, we found the following five actions any charity can take to keep online donations coming in.

Read more

 

 


Ali Carmichael

About Ali Carmichael

Ali (or Alasdair) is an experienced project manager who loves his Gantt charts and milestones! He has over 12 years' experience managing successful online experiences for world-class brands. Ali is responsible for ensuring our clients love what we do for them. Follow Ali on Twitter @AliJCarmichael

Crazy Egg – Our Review

21 Feb

What is it?

Crazy Egg is an online tool that monitors individual pages of your website, giving you a breakdown of where on the screen different visitors have clicked. There is also some more basic analytics data on which pages have been viewed most frequently and where your visitors have come from, presented in different visual displays.

All you have to do is insert a small snippet of code into the HTML of the page you want it to monitor, then sit back and relax. The site does the rest from then on, providing live results as it tracks every click that visitors make on the chosen pages.
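For context, adding the snippet is no more involved than dropping a third-party script tag into the page. The sketch below is purely illustrative: the real snippet and its script URL come from your Crazy Egg account, and the URL shown here is a placeholder.

```typescript
// Illustrative sketch only: how a third-party tracking snippet is typically
// dropped into a page. The real Crazy Egg snippet (and its script URL) comes
// from your Crazy Egg account; the URL below is a placeholder.

function addTrackingScript(src: string): void {
  const script = document.createElement("script");
  script.async = true; // load in the background without blocking the page
  script.src = src;
  document.head.appendChild(script);
}

addTrackingScript("https://example.com/tracking-snippet.js");
```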
 

What are its advantages?

Simple steps to set it up. Probably the biggest advantage of Crazy Egg is that it’s easy to set up on any website, provided you have access to the site’s code. In just three steps you can set up Crazy Egg to monitor as many pages of your site as you wish (with a free account you can monitor up to 4 pages; to monitor more you have to upgrade to a paid account, but prices start as low as $9 for 10 pages, which is peanuts really).

Five different and interactive ways of viewing the results. Crazy Egg provides you with an array of different ways of viewing the results; each has its own uses and will cater to different people’s preferences for viewing information.

 

The four different visual views that Crazy Egg provides: (from left to right) heatmap, scrollmap, confetti, overlay

 
Now all the different views certainly look eye-catching and interesting, but the one we think is the most informative and useful is the confetti view, which displays all the clicks colour coded and gives you the option to filter which clicks from which external sources are displayed on the page. The other views certainly have their uses too, and collectively they provide supporting evidence for theories drawn from the overall information gathered. For example, the scroll map in conjunction with the heatmap or confetti view can give you a good gauge of where the most focused part of the page is and whether users are clicking on your call-to-action buttons within those areas.

 

List view

The list view is the last of the five views and displays a table of the items clicked on within the page

 
Exportable reports of your results. Crazy Egg allows you to export a report of your results, which is good for sharing and bringing to meetings. The different views also allow for an interesting display of information (not the usual pie or bar charts). A slight limitation, however, is that it merely exports a ‘screenshot’ of your snapshot in whichever view you are looking at when you click export report.

Crazy Egg data can aid design decisions. The information that Crazy Egg provides about your website pages can prove helpful when discovering which elements of your website need to change – to be moved or altered aesthetically to enhance their use. As I mentioned earlier, the accuracy and specificity of Crazy Egg is really where its advantage over other analytics tools such as Google Analytics lies. By letting you see exactly where your users are trying to click, or more importantly where they aren’t, it allows you to draw informed conclusions about the potential areas for improvement within the site. Finding out what those improvements should be, however, requires more research than Crazy Egg can provide.
 

What are its limitations?

Despite many claims, it is not a usability tool. Don’t let the reviews fool you. It can help you spot a problem, such as a button that does not get used, but it doesn’t give you any information about why people aren’t clicking on that button. You can make assumptions from the data Crazy Egg gives you, but each assumption you make could create yet another problem. Still, once the problem is spotted, these questions can easily be resolved by investing in a proper usability test of the site.

The confetti visitor box doesn’t move off the snapshot. When viewing the confetti results there is a little black box that lists the different visitor information in one area. However, the box sits within the captured shot or ‘screenshot’ of your page, so at no point can you see the whole page on its own. I realise you can minimise the box, but even then I found it distracting. This might just be a pernickety personal irritation, but I found it quite frustrating when trying to see the spread of clicks across the page.

 

The information on each type of visitor is displayed in the box above which cannot leave the screen

 
You can only compare ‘snapshots’ via the heatmap view. Crazy Egg allows you to compare two or more page results within the site, though you can only compare them within the heatmap view. This can become tiresome, especially if you want to see a comparison of specific clicks between pages.
 

The comparison feature on Crazy Egg only allows for a comparison of the heatmap view

 

When we recommend using it

We would recommend using it if you want to gain a very basic insight into what your users are doing when they come to your site – i.e. which links are being used and which seem to be ignored. This can help you make design decisions to improve your users’ interactions with your site. However, each of these decisions still involves a large element of guesswork, which is why user testing is a thorough technique to use in conjunction with site analytics, to ensure that you know exactly what needs to be changed, why, and what it needs to be changed to.

 

 


Samantha Harvey

About Samantha Harvey

Sam recently graduated in Visual Communication. She joined our team in April 2011 and has been conducting user research and making sure our user interfaces follow good design principles. She's keen to point out our poor selection of fonts... er I mean typography (sorry Sam). Follow Samantha on Twitter @samharvey_ux

Playtesting – why you should user test designs early and often

26 Jan

This is a fantastic video from Penny-Arcade about the virtues of testing concepts and designs early and often with users. The video is aimed at games designers but applies to any designer and translates particularly well to web designers.

It’s not often that we share videos on our blog, but this one is too good not to shout about. It delivers the message we tell our clients, but in a much better and more enjoyable medium. If you don’t mind sitting through the annoying advert at the beginning, you’ll find the reward is absolutely worth it. Enjoy!

 

 


Damian Rees

Loop 11 – Our Review

6 Jan

What is it?

Loop11 is an online, remote usability testing tool. In simple terms, Loop11 allows you to create an online survey that includes tasks for users to complete on a website of your choice. You can mix questions and tasks in any order you wish, and for each task you can set a goal for the user to achieve, e.g. find a size 10, blue women’s jumper for under £30. You then set the URL they will start on at the beginning of the task. There is a demo of how it works on the Loop11 website which takes you through what your users will see when they complete a test.
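To make the shape of such a test concrete, here is a hypothetical sketch of the kind of structure involved: an ordered mix of tasks (each with a goal, a start URL, and the pages that count as success) and questions. The field names are ours for illustration, not Loop11’s.

```typescript
// Hypothetical sketch of an un-moderated test script: an ordered mix of tasks
// and questions. The field names are ours for illustration, not Loop11's own.

interface Task {
  kind: "task";
  goal: string;           // what the user is asked to achieve
  startUrl: string;       // the page the task begins on
  successUrls: string[];  // pages that count as completing the task
}

interface Question {
  kind: "question";
  prompt: string;
  answerType: "text" | "multiple-choice" | "rating";
}

type Step = Task | Question;

const testScript: Step[] = [
  {
    kind: "task",
    goal: "Find a size 10, blue women's jumper for under £30",
    startUrl: "https://www.example-shop.co.uk/",
    successUrls: ["https://www.example-shop.co.uk/womens/jumpers/blue-jumper-123"],
  },
  {
    kind: "question",
    prompt: "How easy or difficult was that to do?",
    answerType: "text",
  },
];
```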
 
Tasks & Questions

A step by step process of setting up your test

 
Below are two examples showing how the test looks to the user when they are completing a task (Figure 1) or answering a question (Figure 2). On each task the objective is always displayed at the top, along with the progression buttons. When asking a question, Loop11 gives you various question types which allow users to answer in an assortment of different ways; the example shown here is a standard text field answer/comment box.
 
Task Example

Figure 1. An example of how a task would look to a user

 
Example Question

Figure 2. You can insert questions before or after tasks; above is an example of a text field question

 

What are its advantages?

This is the closest tool we have found so far to an actual usability test. Loop11 is one of the only services we have found that really lets you create something similar to the usability tests we regularly carry out. You have control over the design of a proper test script, which is the main reason we have found it to be so useful.

You don’t have to be there with the user (in the same room). Loop11 tests are un-moderated, so all you have to do is design the test, set it live, and spread the word by sending a link to your selected participants. Users can then complete the tasks in their own home at their own pace. Its main advantage over face-to-face usability tests is that it allows you to test as many users as you want.

Loop11 isn’t free, but it is cheaper than moderated usability testing. You also don’t have to spend much money on incentives, as users participate when they want and where they want (we’ve written more about this on UXbooth). We still included a small incentive as a thank you for users spending their time completing the test, and as we made sure the test itself wasn’t long, it worked out rather well for everyone.

You can use Loop11 on any device that browses the internet. We haven’t tested this, but we would assume it holds in most cases, as Loop11 runs in the browser and is fully online, so it shouldn’t matter too much what device you use.

View ready-made reports on participants completing your test. After you have set the test live and participants start completing it, you can view reports which show how many people succeeded in completing tasks, how long it took them, and how many pages they went through to get there. This information comes in a number of exportable formats (Excel XML, PDF, CSV, and Excel XML with participants) so that you have access to the original data to do with as you please. The PDF option exports the report version, which includes graphically presented results for the overall test and for each task; however, we found the Excel document of raw data the most useful, as it allowed us to work with the data and produce a report with the information we required. We could then brand it and use it within our projects (a rough sketch of this kind of summary follows the sample report below).

 
Sample Report

A sample version of the PDF report that Loop11 exports after a test is finished
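As an illustration of why we preferred the raw export, here is a hypothetical sketch of summarising exported rows into per-task success rates; the row shape is an assumption rather than Loop11’s actual export schema.

```typescript
// Hypothetical sketch: turning rows of raw exported results into per-task
// success rates. The row shape is an assumption, not Loop11's export schema.

interface ResultRow {
  participantId: string;
  taskId: string;
  succeeded: boolean;
  timeTakenSeconds: number;
  pagesVisited: number;
}

function successRateByTask(rows: ResultRow[]): Map<string, number> {
  const totals = new Map<string, { completed: number; succeeded: number }>();
  for (const row of rows) {
    const t = totals.get(row.taskId) ?? { completed: 0, succeeded: 0 };
    t.completed += 1;
    if (row.succeeded) t.succeeded += 1;
    totals.set(row.taskId, t);
  }
  const rates = new Map<string, number>();
  for (const [taskId, t] of totals) {
    rates.set(taskId, t.succeeded / t.completed);
  }
  return rates;
}
```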

 

What are its limitations?

Once a test is published there is no going back. You can’t edit the test after publishing it; you can only delete it altogether, so you had better get it right the first time! We would recommend thinking carefully about the order of your tasks and questions before adding them to the test. Re-test and double check as many times as possible in preview mode before putting it live.

Judging the success or failure rate can be tricky. The site tests live websites, so for each task you have to set the URL you want users to start from. To track success and failure, you need to add the URLs of the pages you consider to count as a success. This can be problematic if your task can be completed in a variety of different ways: if you don’t anticipate them all, you could record a failure even when a user succeeds.
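To illustrate why this matters, here is a hypothetical sketch of how URL-based success tracking effectively works: the page a participant ends on is compared against the success URLs you anticipated, so any valid completion route you did not list is recorded as a failure. The normalisation and URLs are illustrative only.

```typescript
// Hypothetical sketch of URL-based success tracking: a task only counts as a
// success if the participant ends on a page you anticipated in advance.

function normalise(url: string): string {
  const u = new URL(url);
  // Ignore query strings and trailing slashes so trivial variations still match.
  return (u.origin + u.pathname).replace(/\/+$/, "");
}

function isSuccess(reachedUrl: string, successUrls: string[]): boolean {
  const reached = normalise(reachedUrl);
  return successUrls.some((success) => normalise(success) === reached);
}

// A participant who reaches the product through site search ends on a URL you
// did not list, so a genuine success is recorded as a failure.
console.log(isSuccess(
  "https://www.example-shop.co.uk/search/results/blue-jumper-123",
  ["https://www.example-shop.co.uk/womens/jumpers/blue-jumper-123"],
)); // false
```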

Tasks need to be carefully designed. The design of each task becomes critical in an un-moderated test, especially with this tool, as it needs to be much less fluid than a typical face-to-face usability test, where we would normally give users the freedom to complete the task as they would naturally. With Loop11 you are forced to be more quantitative in your approach to get a more defined fail/success criterion, so we found the tool pushed us to design tasks that were much more basic and definite. For example, in a face-to-face test we might ask users to describe a problem they typically face and then show us how they would solve it using the site. With Loop11 we would design the task to answer a very specific question we know they can find the answer to on a specific page.

Slow loading times on some websites. Sometimes the website you want to test will perform slowly, which could affect how likely people are to complete the test. We also noticed it crashed on a few sites. We recommend checking how well a site performs in preview mode before you commit to a purchase. Loop11 does provide a piece of JavaScript that you can insert into the body of the site to improve the speed; unfortunately we couldn’t do this on the site we were using, which is a drawback if you are testing a site without access to the back end.

You might get more participants than you bargained for. One quite costly drawback, especially for those of us who incentivise participants, is that although you can set a limit on the number of tests, it only cuts people off after that number have completed the form. It cannot stop people who had already started the test while the 20th (in this example) person was completing it, so you may end up paying out for slightly more people than you originally wanted (we set a limit of 20 but had to pay 24 participants).

You don’t know who really completed the test properly. This is probably the most obvious limitation of un-moderated testing. You would like to think that anyone who sat down to do the test would want to do it properly, but you will get those who just rush through the tasks and questions to gain the incentive at the end. Normally you can look at the results and spot the people who probably didn’t complete it properly: those who took the shortest time to complete their tasks and didn’t write any comments. The question you then have to ask yourself is whether to discount them from the testing.
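As a rough illustration of the kind of sanity check we mean, the sketch below flags participants who finished unusually quickly and left no comments; the threshold and field names are arbitrary choices of ours, not anything Loop11 provides.

```typescript
// Rough sketch of flagging participants who may have rushed through the test.
// The 90-second threshold is arbitrary and the field names are illustrative.

interface ParticipantSummary {
  participantId: string;
  totalTimeSeconds: number;
  comments: string[];
}

function flagSuspectParticipants(participants: ParticipantSummary[]): string[] {
  return participants
    .filter((p) => p.totalTimeSeconds < 90 && p.comments.every((c) => c.trim() === ""))
    .map((p) => p.participantId);
}
```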
 

When we recommend using it

If you’re looking for a quick, high-level view of how a website is doing, then this would be a good solution for you. It also helps if the users you need to test are geographically spread out or too time-poor to meet one to one; with Loop11 you can test anyone with internet access, wherever and whenever suits them. Loop11 is a useful tool for benchmarking a site and finding out where users are running into an issue, before carrying out some face-to-face usability testing to find out exactly where the problem lies. The scenario where we are most likely to use it is to monitor the usability of sites we have already worked on. However, as the tool has strict success/fail metrics, it is only really suited to carefully designed tasks which have a clear answer.
 
See anything I’ve missed? Just let me know!

 

 


Samantha Harvey