Following on from Friday’s post about the planning and requirements phase of our project to re-invent the GoodPractice user experience, this post covers the design phase, where it all started to come together and a clearer picture emerged of how the finished product would look. There were two work-streams in our design phase:

  • the information architecture review, and
  • prototyping

Both of these terms, I suspect, are deliberately constructed to lend the process an air of mystery. Information architecture is simply the way that content is organised, most often into various levels of categories. Prototyping is a series of activities aimed at producing a mock-up of how the finished product will look.

We spent quite a bit of time in the design phase making sure that we were getting things right before embarking on the technical development work. This was especially important to us since the end result is a departure from our previous toolkit designs.

Redefining the information architecture

GoodPractice have been in business for almost ten years now and during that time we’ve gone from having a single product, aimed at HR and learning professionals, to a range of client solutions aimed at different audiences with different needs. Over that time we’ve learned some interesting lessons about how certain ways of categorising our content can lead to a better user experience.

This time around, we decided to get two reviews of our content: one by an expert information designer from our partners at Storm ID, and another by the users themselves in a card sorting exercise.

Card sorting is used by content providers all the time to find new and better ways of presenting their offering to their users. From Amazon categorising its products to the major news sites categorising their features and articles, it’s a well-established method of gaining valuable insight into how users think about content and look for it.

From these two exercises we got a few very clear recommendations for reorganising our content:

  • clearer, more direct labelling
  • smaller chunks
  • better ways of filtering the content
  • easier ways to find similar content

Since then, our Editorial team have reviewed every single piece of content we have and restructured our products to accommodate the feedback we received. Even after we introduce our new-look toolkits, we’ll be constantly reviewing the feedback we receive to improve on this vital part of our users’ experience.

Low-fidelity prototyping

My favourite phrase of the whole process: low-fidelity prototyping. It was just Craig and Shelagh’s (our wonderful designers from Storm ID) fancy way of saying “drawing pictures on flip charts”.

We knew what features and functions we wanted, but we now needed to get a clearer idea of how these would work. Craig and Shelagh drew pictures. Lots and lots of pictures.

Following the low-fidelity prototypes, we very quickly settled on a design that seemed to work best from all our user personas’ perspectives. We could see Algie, Harry and Bertha all using the site in ways that suited their own personal internet habits. Craig and Shelagh then worked up the sketches into wireframes: basic visual guides to the whole interface design of the product.

The wireframe is a very basic conceptualisation of the finished product. Its main purpose is to check that all the navigation and supplementary elements of the page are in the right place and named properly. No images are used and no colour is present, so the designers can later work out how colour should be used to draw the user’s eye. As an example, our wireframe for the homepage looked like this:

The next step was to find out exactly what our users would think about the new designs.

User testing and eye-tracking

The ultimate test of a new design is to put it in front of the people that use your products and find out what they think. Our web designer, Rob, had created an online, interactive version of the designs that Craig and Shelagh produced. It wasn’t a fully functioning prototype, but it was a functioning website with some content placed inside example toolkits. The idea was to get a feel for what people thought about the new look, how they used the site, and whether the new information architecture made sense.

We carried out usability testing with 10 people: some were regular users of our toolkits, while others had never seen them before; some were ‘Algie’ type users, while some were ‘Harry’ type users. They were greeted by an independent consultant and given a series of tasks on the prototype website. We gathered their feedback not only through their comments but also through eye-tracking, so we could tell exactly what they were looking at.

The results were fascinating and we were certainly glad that we carried out the testing. Despite it being a regular feature of many sites, navigation on the right-hand side of the screen was almost completely ignored by our users. We also got great feedback about some of the language used on the site for certain features, especially where it was ambiguous or overly ‘web 2.0’. Finally, and most importantly, we got some very useful insights into how best to use colour on the sites to highlight certain navigation and search elements, which we will now be able to pass on to our clients when designing their sites.

The polishing

With all the feedback we had, Craig and Shelagh were able to polish up the designs with choices of colour, typography, icons and other graphical and UI devices. When they were finished, we had something that looked like this.

All that remained was to turn the design into a fully functioning web application, a process I’ll cover in a post tomorrow ahead of our official launch at the CIPD HRD conference on Wednesday.

Other posts in this series

This is the third in a series of posts describing how we approached the redesign of our online toolkits for managers and leaders. The other posts in the series are:

1: Re-inventing the GoodPractice User Experience

2: User Experience: Planning and Requirements

4: User Experience: Implementation, Testing and Measurement