
Loop11 – Our Review

6 Jan

What is it?

Loop11 is an online, remote usability testing tool. In simple terms, Loop11 allows you to create an online survey that includes tasks for users to complete on a website of your choice. You can mix questions and tasks in any order you wish, and for each task you can set a goal for the user to achieve, e.g. find a size 10, blue women’s jumper for under £30. You then set the URL that each task starts on. There is a demo on the Loop11 website which takes you through what your users will see when they complete a test.
 
Tasks & Questions

A step-by-step process of setting up your test

 
Below are two examples showing how the test looks to the user when they are completing a task (Figure 1) or answering a question (Figure 2). On each task the objective is always displayed at the top, along with the progression buttons. For questions, Loop11 gives you various question types which let users answer in an assortment of ways; the example shown here is a standard free-text answer/comment box.
 
Task Example

Figure 1. An example of how a task looks to a user

 
Example Question

Figure 2. You can insert questions before or after tasks; above is an example of a text-field question

 

What are its advantages?

This is the closest tool we have found so far to an actual usability test. Loop11 is one of the few tools we have found that really lets you create something similar to the usability tests we regularly carry out. You have control over the design of a proper test script, which is the main reason we have found it so useful.

You don’t have to be in the same room as the user. Loop11 tests are un-moderated, so all you have to do is design the test, set it live, and spread the word by sending a link to your selected participants. Users can then complete the tasks in their own home at their own pace. Its main advantage over face-to-face usability tests is that it allows you to test as many users as you want.

Loop11 isn’t free, but it is cheaper than moderated usability testing. You also don’t have to spend much money on incentives, as users participate when and where they want (we’ve written more about this on UXbooth). We still included a small incentive as a thank you to users for spending their time completing the test, and as we made sure the test itself wasn’t long, it worked out rather well for everyone.

You can run Loop11 tests on any device that browses the internet. We haven’t tested this ourselves, but since Loop11 runs entirely in the browser, in most cases it shouldn’t matter too much what device you use.

View ready-made reports on participants completing your test. After you have set the test live and participants start completing it, you can view reports which show how many people succeeded in completing tasks, how long it took them, and how many pages they went through to get there. This information can be exported in several formats (Excel XML, PDF, CSV, and Excel XML with participants) so that you have access to the original data to do with as you please. The PDF option also exports the report version, which includes graphically presented results for the overall test and for each task. However, we found the Excel document of raw data the most useful, as it allowed us to work with the data to produce a report with the information we required. We could then brand it and use it within our projects.

 
Sample Report

A sample version of the PDF report that Loop11 exports after a test is finished
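If you export the raw data, a few lines of scripting are enough to pull out the headline numbers yourself. Below is a minimal sketch in Python, assuming a CSV export with hypothetical column names (task, outcome, time_seconds); Loop11’s actual export headers may differ.

```python
# A minimal sketch: per-task success rate and median completion time from a
# raw CSV export. Column names here are assumptions, not Loop11's real headers.
import csv
from statistics import median

def summarise(path):
    tasks = {}  # task name -> {"times": [...], "successes": int, "total": int}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            t = tasks.setdefault(row["task"], {"times": [], "successes": 0, "total": 0})
            t["total"] += 1
            t["times"].append(float(row["time_seconds"]))
            if row["outcome"] == "success":
                t["successes"] += 1
    for name, t in sorted(tasks.items()):
        rate = 100.0 * t["successes"] / t["total"]
        print(f'{name}: {rate:.0f}% success, median {median(t["times"]):.0f}s')

summarise("loop11_export.csv")
```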

 

What are its limitations?

Once a test is published there is no going back. You can’t edit the test after publishing it; you can only delete it altogether, so you’d better get it right the first time! We would recommend thinking carefully about the order of your tasks and questions before adding them to the test. Re-test and double-check as many times as possible in preview mode before putting it live.

Judging the success or failure rate can be tricky. The tool tests live websites, so for each task you have to set the URL you want users to start from. To track success and failure, you then need to add URLs specifying the pages which you consider to count as a success. This can be problematic if your task can be completed in a variety of different ways: if you don’t anticipate them all, you could record a failure even when a user succeeds.
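To make the pitfall concrete, here is a minimal sketch of URL-based success matching. The URLs and normalisation rules are illustrative assumptions, not Loop11’s actual logic; the point is that any success route you fail to list gets recorded as a failure.

```python
# Illustrative sketch of URL-based success criteria (not Loop11's real logic).
from urllib.parse import urlparse

SUCCESS_URLS = {
    "example.com/jumpers/blue-size-10",
    "example.com/sale/jumpers/blue-size-10",  # alternate routes are easy to forget
}

def normalise(url):
    """Strip scheme, query string, and trailing slash so trivial
    variations of the same page still count as a match."""
    p = urlparse(url)
    return (p.netloc + p.path).rstrip("/").lower()

def is_success(final_url):
    return normalise(final_url) in {normalise(u) for u in SUCCESS_URLS}

# A user who reaches the goal via an unanticipated page records a failure:
print(is_success("https://example.com/jumpers/blue-size-10?ref=nav"))  # True
print(is_success("https://example.com/search?q=blue+jumper"))          # False
```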

Tasks need to be carefully designed. The design of each task becomes critical in an un-moderated test, especially with this tool, as it needs to be much less fluid than a typical face-to-face usability test, where we would normally allow users the freedom to complete the task as they would naturally. With Loop11 you are forced to take a more quantitative approach in order to get a well-defined success/fail criterion, so we found the tool pushed us to design tasks that were much more basic and definite. For example, in a face-to-face test we might ask users to describe a problem they typically face and then show us how they would solve it using the site. With Loop11 we would design the task around a very specific question we know they can answer on a specific page.

Slow loading times on some websites. Sometimes the website you want to test will perform slowly, which could affect how likely people are to complete the test; we also noticed it crashed on a few sites. We recommend checking how well a site performs in preview mode before you commit to a purchase. Loop11 do provide a piece of JavaScript that you can insert into the body of the site to improve the speed. Unfortunately we couldn’t do this on the site we were using, which is a drawback if you are testing a site without access to the backend.

You might get more people participating than you bargained for. One quite costly drawback, especially for those of us who incentivise participants, is that although you can set a limit on the number of participants, Loop11 only closes the test after that number have completed it. It cannot stop people who had already started the test while the 20th (in this example) person was finishing, so you may end up paying out for slightly more people than you originally wanted (we set a limit of 20 but had to pay 24 participants).

You don’t know who really completed the test properly. This is probably the most obvious limitation of un-moderated testing. You would like to think that anyone who sat down to do the test would want to do it properly, but you will get those who just rush through the tasks and questions to gain the incentive at the end. Normally you can look at the results and spot the people who probably didn’t complete it properly: those who took the shortest time to complete their tasks and didn’t write any comments. The question you then have to ask yourself is whether to discount them from the results.
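If you want a consistent rule rather than eyeballing the results, a simple filter over the exported data can flag likely rushers. This is an illustrative sketch with made-up field names and thresholds, not a Loop11 feature.

```python
# Illustrative sketch: flag participants who finished implausibly fast AND
# left no comments. Field names and the threshold are assumptions.
def flag_suspects(participants, min_seconds=120):
    return [
        p["id"] for p in participants
        if p["total_seconds"] < min_seconds and not p["comments"].strip()
    ]

participants = [
    {"id": "P01", "total_seconds": 640, "comments": "Struggled on task 3."},
    {"id": "P02", "total_seconds": 85,  "comments": ""},  # rushed, no comments
]
print(flag_suspects(participants))  # ['P02']
```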
 

When we recommend using it

If you’re looking for a quick, high-level view of how a website is doing then this would be a good solution for you. It also helps if the users you want to test are geographically spread or too time-poor to meet one-to-one: with Loop11 they can take part anywhere with internet access, at any time. By its nature, Loop11 is a useful tool for benchmarking a site and finding out where users are running into an issue, so you can then carry out face-to-face usability testing to find out exactly where the problem lies. The scenario where we are most likely to use it is to monitor the usability of sites we have already worked on. However, as the tool has strict success/fail metrics, it is only really suited to carefully designed tasks which have a clear answer.
 
See anything I’ve missed? Just let me know!

 

 


 

 


About Samantha Harvey

Sam recently graduated in Visual Communication. She joined our team in April 2011 and has been conducting user research and making sure our user interfaces follow good design principles. She's keen to point out our poor selection of fonts... er, I mean typography (sorry Sam). Follow Samantha on Twitter @samharvey_ux

TheClickTest.com – Our Review

11 Nov

What is it?

The Click Test is a simple, quick test that allows you to upload an image (which can contain one or two versions of a design for viewers to choose from) and ask the viewer a question which they answer by clicking on a certain part of the design. It’s part of a suite of tools offered by UsabilityHub (we’ll review the other tools later).

A good example of how theclicktest.com works: upload a screenshot of a homepage you are designing and ask the viewer to click where they think they would find the contact details. You don’t have to use it just for web designs, though; in the example below it is being used to determine which picture the majority thinks best represents the description.
 

Example of theclicktest.com

A screenshot of the tool in action where users are invited to click the option they feel works best

 

What are the advantages of using theclicktest.com?

Theclicktest.com is simple and can be very useful when trying to answer basic questions about interface design. If you need a quick response to confirm a suspicion, a Click Test only takes a minute to set up and is free if you collect ‘Karma points’ by completing a few of the other tests that members have uploaded, e.g. complete 2 random Click Tests on the site and you earn 2 Karma points, which in turn allows you to have 2 people take your test. For those of us in a hurry, or who need a large number of people to take the tests, there is also the option to purchase Karma points or simply email a URL to existing contacts.

The results start being collected instantly, and due to the popularity of the site and the random order in which tests are given to viewers, you are bound to start collecting results within the hour. Every time we have used the Click Test we have only had to wait a day to collect the number of results we wanted.

The results come in three different forms: a plasma map, a heat map, and a click map.
 

Snapshot of the result maps

A snapshot of the results for choosing a colour scheme (plasma, heat, & click map from left to right)

 
We found the click map the most useful, as it clearly shows how many people clicked on which graph (in the above example). Nonetheless the other maps are visually pleasing, and with a higher volume of participants and a different sort of test they could prove just as useful!
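For a rough idea of what the click map summarises, here is an illustrative sketch that counts raw click coordinates against bounding boxes for each option. The region names and coordinates are made up for illustration; the tool produces this view for you.

```python
# Illustrative sketch: aggregate (x, y) clicks into per-option counts,
# which is essentially what a click map visualises. All values are made up.
REGIONS = {
    "option_a": (0, 0, 400, 300),    # (left, top, right, bottom) in pixels
    "option_b": (400, 0, 800, 300),
}

def count_clicks(clicks):
    counts = {name: 0 for name in REGIONS}
    for x, y in clicks:
        for name, (l, t, r, b) in REGIONS.items():
            if l <= x < r and t <= y < b:
                counts[name] += 1
                break
    return counts

print(count_clicks([(120, 150), (610, 90), (650, 200)]))
# {'option_a': 1, 'option_b': 2}
```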

What are the limitations of using theclicktest.com?

Questions that are more complex might not be as reliable, as the tests are meant to be quick, and determining how much time someone spent reading the question and thinking about the answer is near impossible. This brings me to the main problem with the Click Test, and in fact with all small, short online tests: you cannot tell whether the answers you are given are genuine, as people may be participating just to earn Karma points, not really taking notice of the questions and simply clicking anywhere to complete the test.

Another issue that comes with allowing anyone to take part in your test is that, beyond the people you recruit yourself, you do not know whose opinions you are taking note of, which may be detrimental to your design.

In addition, the tool only records one click: the last click a viewer makes on the page before finishing. This means your questions are limited, as each must have a single answer for the test to work. We found this out the hard way by uploading a test that asked the viewer to click on the ‘two best colour schemes’; when we checked the results, we discovered that only one click had been recorded for each participant.

When should I use it?

So far we’ve found it a useful tool for getting opinions on basic design elements such as colour scheme choice and chart style, but we haven’t relied upon this data alone. We’ve used it to guide decisions and have then followed up with face-to-face user feedback before making a final call. By all means use it to test web designs and guide any other design decision that fits into a one-click answer, but don’t let it be the only result you use to make a final decision.

If you can think of anything else you would like us to cover in future reviews please get in touch!

 

 


 

 


How to deal with opinions about your website

3 Feb

Asking people what they think about your website can be a big mistake. Initially, the opinions of the people you respect, or of your customers, appear really useful, until you start to see that few people agree and most opinions contradict each other.

It is important to remember that people’s opinions do not necessarily reflect reality. Ultimately, what people say they do, what they say they like or dislike, or what they say will influence them may not hold true when they’re actually sat in front of a website using it to solve a problem. In our research with users, we’ve seen many people tell us they only use websites in a certain way, only to completely contradict themselves when they come to use a website to complete a task. Often what influences us is not processed at a conscious level, so as humans we can be quite unreliable when predicting our future behaviour or explaining our previous behaviour.

If you want to know how to improve your website there is no substitute for seeing real customers using your site. And when it comes to improving your website, focus on the logical factors rather than the emotive opinions.

There are of course times when you don’t ask for opinions but receive feedback anyway, from friends, colleagues, customers, and peers. When you receive an opinion or comment about your site, try not to engage with it emotionally; instead, look at whether it is positive or negative and whether it comes with valid reasons or justifications.

To help, use our categories below to determine what to do with any feedback you receive about your website:

Positive comment with no justification – This is the type of opinion you get from your Mum. They are saying nice things, but it’s nothing meaningful to help you improve your website


Positive comment with good justification – This is useful. Think of what actions you have taken on the site to lead to this opinion and consider how you can maintain and transfer it to other areas of the site


Negative comment with good justification – This is useful. Think of what the likely causes are for this comment and investigate it further. If you receive similar comments over time start looking at your site analytics for possible trends and tell the person responsible for user experience to include this in the next usability test.


Negative comment with no justification – This is the type of comment you might get from someone who wants a reaction out of you. Typically this comes from a negative frame of mind and is unhelpful.

Opinions about your website are rarely useful in helping you make improvements. When you do receive comments, take time to strip them of their emotion and consider whether they have real validity. Only when you have a number of comments highlighting a theme should you consider investigating further.

How do you deal with opinions about your website?

 

 


 

 


About Damian Rees

Damian has worked as a usability and user experience consultant for over 13 years. He has held senior roles within companies like the BBC and National Air Traffic Services, where he researched and designed for users in a variety of contexts including web applications, voice recognition, and air traffic control interfaces. Follow Damian on Twitter @damianrees