User-Centred Product Creation in Interactive Electronic Publishing


Just-in-time coaching

How VNET5 coaching helped a project to improve its user focus and achieve successful user trials at the product launch

by Miles Macleod, Performance By Design Ltd, January 2003

The background: a complex product for non-technical users

The team developing a system for managing creative rights recognised usability as a critical success factor. They wanted to assure good usability in the product late in the development cycle, quickly and with minimum disruption.

The product is for non-technical users: film distributors, producers, and creatives in the industry. Success depends upon these people finding the system usable. The project had a budget for user trials, but their overriding priority so far had been to deliver a working system meeting the functional specification. Now there was a need for some rapid user-centred actions.

Understand the issues, plan the actions

As a first step, four project team members attended a VNET5 user-centred product creation workshop. This built understanding of what could be done, and allowed the project to see how others had tackled similar situations.

It also helped the team develop their plans for user validation, summarising the qualities the system would need to be usable for each type of user, and what should be done to assure that the system would achieve those qualities.

The point of need

The project had an immovable deadline: a public launch at the Venice Film Festival, where public user trials were planned. Before going public, it is good user-centred practice to run one or more iterations of user testing feeding back into design. This helps iron out initial usability problems in private and allows the design to be refined. Here there was no time for that, but it was essential to manage the risk of show-stopping usability issues being discovered at the launch.

Initial 'quick and dirty' usability walkthrough

Many projects find themselves in this kind of position, and have to do some quick usability testing without users. Here we used 'expert walkthrough' of key scenarios of use (identified in the validation planning). This is particularly effective when the evaluator has a lot of experience of watching real users try to use new systems, and can anticipate user problems.

We found a number of moderate usability issues, some immediately actionable, some to be investigated further in the real user testing.

Demonstration and real user testing?

The second big challenge was designing the real user testing: how to combine demonstration with some worthwhile user validation. Demonstration and validation pull in different directions. We needed to get a fair picture, from the users' viewpoint, of the reality of the product: what is good and bad, easy and difficult, liked and disliked, to assess both user performance and user satisfaction against the quality of use success criteria.

Observing user performance

The solution was to give an initial demonstration of what the system can do, and then allow people to explore using it without interference or guidance unless they became totally stuck - all the while observing and noting their comments, issues and problems.

This would give data on the usability of the system for people at the important stage of getting to know their way around it. Other testing would be done in later field trials covering usability for people starting without the demonstration, and usability for expert users.

It is all too easy for project teams to underestimate the learning curve for new users, so here it was important for them to look for things users did not at first understand. All project team members present at the Film Festival launch were briefed and rehearsed in the demonstration and observation protocols, and also in interviewing users and assessing user satisfaction.

Assessing user satisfaction

User satisfaction assessment used a questionnaire and a debrief interview. VNET5 had previously given the team guidance in developing a practical questionnaire that asked the right questions - relating to the quality of use criteria and critical success factors in the validation plan. This was piloted and refined for use in the trials.

It included subjective ratings to give some baseline measurements, and enable comparison across different user types. It also included sufficient questions about the user to enable findings to be related to user characteristics.

The outcomes

The demonstrations went successfully, and there were valuable usability findings. People strongly welcomed the potential of the system. The team collected numerous user comments, and aspects of use that demonstrators had observed, including things that could substantially enhance user satisfaction and performance.

VNET5 coaching helped the team analyse and interpret the user trial results. Some of these confirmed things noted in the walkthrough, others were new. Findings ranged from general comments (e.g. 'too much text'), to specific views on details of the design.

Sufficient people of each of the three main user types participated to enable some statistical analysis of the user satisfaction findings. This showed variations between the user groups that would be helpful for refining the design.
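The kind of group comparison described above can be sketched in a few lines. This is a hypothetical illustration only: the user group names follow the article, but the rating values and the choice of a one-way ANOVA are assumptions, not the project's actual data or method.

```python
# Hypothetical sketch: comparing subjective satisfaction ratings
# (here on a 1-7 scale) across the three main user types.
# All rating values are invented for illustration.
from statistics import mean

ratings = {
    "distributors": [5, 6, 5, 4, 6, 5],
    "producers":    [4, 3, 5, 4, 4, 3],
    "creatives":    [6, 6, 5, 6, 7, 5],
}

def one_way_anova(groups):
    """Return the F statistic for a one-way ANOVA over the given groups."""
    all_values = [v for g in groups.values() for v in g]
    grand_mean = mean(all_values)
    k = len(groups)              # number of groups
    n = len(all_values)          # total observations
    # Between-group sum of squares: how far each group mean sits from the grand mean
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups.values())
    # Within-group sum of squares: spread of ratings inside each group
    ss_within = sum(sum((v - mean(g)) ** 2 for v in g) for g in groups.values())
    return (ss_between / (k - 1)) / (ss_within / (n - k))

for name, g in ratings.items():
    print(f"{name}: mean rating {mean(g):.2f}")
print(f"F statistic: {one_way_anova(ratings):.2f}")
```

A large F statistic suggests the variation between groups outweighs the variation within them, which is the sort of between-group difference the trials revealed.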

The essential thing in this kind of evaluation is to prioritise the findings: over-long lists of 'usability bugs' can be counterproductive. Here there were specific actionable findings that would help improve the next version. When the key things to be improved are understood and agreed, then there is a better chance they will be acted upon.
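One common convention for prioritising findings is to score each by severity and frequency and rank by the product. This is a hypothetical sketch using widely practised severity/frequency scoring; the example findings and scores are invented and do not reproduce the project's actual prioritisation.

```python
# Hypothetical sketch: ranking usability findings so the key
# improvements surface first. Findings and scores are invented.
findings = [
    {"issue": "too much text on main screens",        "severity": 2, "frequency": 3},
    {"issue": "search option names not understood",   "severity": 3, "frequency": 3},
    {"issue": "buttons hard to read",                 "severity": 2, "frequency": 2},
    {"issue": "wildcard use not shown in example",    "severity": 1, "frequency": 2},
]

# Simple impact score: severity x frequency (both on 1-3 scales here)
ranked = sorted(findings, key=lambda f: f["severity"] * f["frequency"], reverse=True)
for f in ranked:
    print(f"{f['severity'] * f['frequency']:>2}  {f['issue']}")
```

Keeping the output to a short, agreed top-of-list avoids the over-long 'usability bug' lists the article warns against.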

Example of improvements: Simple search

Originally the simple search function was difficult to understand and use (see Fig. 1): (1) The naming of the search options, especially "products" and "brands", was not understood. (2) The buttons (white capital letters on a black background) were difficult to read and sometimes not noticed. (3) Steps 1 and 2 in the search criteria were presented badly; it remained unclear what the minimum entry was. (4) Unclear wording and terms were used (e.g. select "only internet locations"). (5) The example entry did not show any possibilities for abbreviation or how to use wildcards. (6) The main categories were presented as if they were links to external sites.