While this might not be as strict (or serious?!) an evaluation technique as, say, heuristic analysis or usability testing, having two people act out the roles of the user and the user interface (UI) of a system can, I think, be very revealing.
It’s a little bit like some of the more elaborate “paper prototyping” scenarios I’ve seen, but I first heard of this in a talk by Stephen P. Anderson at UXLx 2010. Perhaps the best thing to do is to see Stephen describe it himself at another event in Norway.
Usability testing means observing representative users carrying out representative tasks with a system (for example, your data visualisation tool), and recording any issues that come up!
Ideally, there’s a participant (a user to test your system), a facilitator (that might be you), and an observer (who takes notes).
Dave Hamill’s usability testing tips & tricks (sketchnotes from NUX2)
As part of evaluation, critique gives you a structured way to discuss usability issues with the developer of a system. Aim to give and receive feedback in a positive way. Be kind: you don’t need to put the recipient on the defensive (although avoiding that takes skill and practice).
Heuristic analysis provides a way of objectively evaluating your own work or that of others against a set of design principles (“heuristics”). While it is a great way to pick up issues early on, it is not a replacement for observing real people using your system. We’ll get to usability testing later on!
A nice little primer for Jakob Nielsen’s 10 Usability Heuristics. This was created by João Machini and is used with permission.
A matrix of tactics for getting usability issues fixed
On July 20th, I had the pleasure of giving a workshop at UX Bristol 2012 alongside Caroline Jarrett. We promised that we would share all the great ideas and recommendations that our participants generated. These were tactics for making sure that the usability issues you find actually get fixed.
A big thank you to Steve Krug for allowing us to build on all the work he did with Caroline on this topic, and for letting us reuse his slides in our presentation.
The following is a matrix of those “lightbulb tactics”, with the four main usability testing phases we considered on one axis, and some apparent themes along the other. I produced those themes by doing a quick bit of affinity mapping of all the tactics… hopefully, they make sense!
A matrix of tactics for getting usability issues fixed. Click on it to see a bigger, annotated version!
It was great to hear Noah Iliinsky talk about how to apply a design process to data visualisation. He spent three hours with us, starting off with a short talk, and then guiding the audience through applying this design process to their data in a meaningful way.
A fuzzy iPhone photo of Noah Iliinsky sharing ideas during his workshop for EBI Interfaces
Anyone who works with me will know that I’m often asking “What problem are you trying to solve?”, and this of course applies as well to data visualisation as it does to system or interaction design. So it’s no surprise that I find Noah’s work really engaging and inspiring. He talks about understanding your reader (the audience… the “user” – their drivers and needs); understanding your data (its characteristics and dimensions, and the message within it that you wish to convey); and understanding the choices you can make as you apply a design process to visualisation, choosing how to convey knowledge and enable action.
Triangulating these gets you most of the way to good data visualisation.
Some of you might have attended the most recent how-to workshop we ran on campus, looking at card sorting as a technique for organising information. If you’d like to follow up on that and learn more, Donna Spencer (@maadonna), author of the Rosenfeld Media book “Card Sorting: Designing Usable Categories”, has just updated her resources page for the topic.
Lots of great information, advice and articles.