Way back in 2001, I started to study usability as part of my MSc Multimedia and Education. I found it as fascinating as anything else I was studying at the time and in many ways more so. However, just like accessibility and inclusion, usability is often overlooked when planning learning activities and episodes that include any kind of technological intervention. Yet, it is essential to the ultimate success of any process that has an end user.
Of course, way back in 2001 web design was in its infancy and mistakes had to be made before usability was properly understood. In fact, the process is only now becoming a standard feature of large-organisation web design. One of my ‘good reads’ at the time was Ben Shneiderman – http://www.cs.umd.edu/~ben/ – for example: http://faculty.washington.edu/jtenenbg/courses/360/f04/sessions/schneidermanGoldenRules.html
Today, I travelled to Coventry where I had been invited by LSIS to attend a short workshop on User-Centred Design (UCD). LSIS are re-visiting the Excellence Gateway (EG) and I’m leading a small focus group of practitioners to advise (or otherwise inform) them on the use of social media (Web 2.0). The UCD workshop is part of their preparation for the future.
Andrew Lamb from DirectGov was speaking when I got there (delayed train!) and straight away I understood exactly where he was coming from. He was showing an example of a large ‘exemplar’ web site’s analytics (I missed which site it was), which showed how visitors to the site moved around it. He explained that although six pages had been ready for launch, they actually launched just two to start with. The analytics showed a 68% drop-out rate by page two. This suggested that the design was wrong and that these two pages, along with the further four, needed a huge overhaul before launch – and this despite the site being regarded as a great success. Andrew went on to show some further figures on how planned efficiencies can be lost if web sites lose just 1% of their predicted visitors.
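The arithmetic behind those figures is worth making concrete. Here is a rough sketch of it – the 68% drop-out rate and the 1% figure are from the talk, but the visitor count and the per-visit saving are entirely hypothetical, purely to illustrate why a small percentage loss matters at scale:

```python
# Sketch of the drop-off and lost-efficiency arithmetic from the talk.
# The 68% and 1% figures are from the workshop; the visitor count and
# per-visit saving are invented for illustration only.

def remaining_visitors(start, drop_out_rate):
    """Visitors still on the site after a page with the given drop-out rate."""
    return round(start * (1 - drop_out_rate))

def lost_savings(predicted_visitors, saving_per_visit, loss_fraction):
    """Planned efficiency lost when a fraction of predicted visitors never arrive."""
    return predicted_visitors * loss_fraction * saving_per_visit

visitors = 1_000_000                          # hypothetical annual visitors
print(remaining_visitors(visitors, 0.68))     # 320000 still there after page two
print(lost_savings(visitors, 3.50, 0.01))     # 35000.0 – lost at just a 1% shortfall
```

Even at a modest (and made-up) £3.50 saved per online visit, a 1% shortfall against a million predicted visitors is £35,000 of planned efficiency gone – which is presumably the point Andrew was making.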
We then had an input from the team that created Next Step from various other provisions. Vanessa Clynes told us how they had first built linked wire-frame versions (but not linked to live databases) for users to try before anyone on the ‘development’ side of the website became involved. The wire-frames were then user tested with knowledgeable practitioners, in this case careers advisors, to help take the design through to the next stage. As this pre-beta website developed, real users were invited to ‘use’ the site and comment before it was finally ready to be passed on to the web development team. Andrew Lamb intervened here to say that the knowledgeable practitioners (my description) often brought their own bias to the process and that the only real test should be with real users. This was accepted throughout the room and is, I suppose, where my small focus group comes in – to act as proxies for the real user evaluation until such time as the new site is ready (a long way off). Having said that, we are not involved in design – just content.
Now of course, we are talking here about sites which have the potential to be visited by millions of users, but the fundamental message to be heard is ‘listen to your users’. We should all do this when building our VLEs, creating our presentations (PPT v Prezi maybe) and, more generally, when simply teaching. Vanessa also told us how they brainstormed six ‘types’ of user for their site. Each type represented a ‘segment’ of their audience and, because there really is no ‘average’ user, they had to develop a persona for each segment. Each persona had an image, a job (or not), a spouse, children and so on, so that the team could visualise what each segment of their audience required.
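If you wanted to try the same segment-to-persona exercise yourself, the shape of a persona is simple enough to sketch. The fields and the sample persona below are invented for illustration – the Next Step team will have used their own research-driven attributes:

```python
# A minimal sketch of the persona-per-segment idea described above.
# Every field and the sample persona are hypothetical examples.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Persona:
    segment: str            # which slice of the audience this persona stands for
    name: str
    image: str              # a face makes the persona feel like a real person
    job: Optional[str]      # None captures the "(or not)" case
    family: str
    goal: str               # what this person actually wants from the site

sam = Persona(
    segment="returning to work",
    name="Sam",
    image="personas/sam.png",
    job=None,
    family="married, two children",
    goal="find retraining options that fit around school hours",
)

print(f"{sam.name} ({sam.segment}): {sam.goal}")
```

The value of the exercise is less in the data structure than in the discipline: six concrete people are much harder to design badly for than one imaginary ‘average’ user.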
Could we do that same thing when planning to use technology in our teaching? Could we provide different ways for our users (learners) to access our teaching?