Once the conference schedule became available, the workshop on usability immediately caught my eye.
In our company, we don’t have a usability expert. Creating screens and various input forms is flexible because we have a built-in editor for it, and designing the “look” of a product’s components is part of the engineer’s task. As a result, the GUI design lacks consistency: even when a pattern has been established, it is not followed, or the details are “off”.
What I was looking for in this workshop, led by Hegle Sarapuu, was to learn some basics about how to carry out usability testing, what the dos and don’ts are, and where to start at all. In that respect, the workshop fulfilled my expectations, and I had plenty of food for thought afterwards. Even though the practical part took up only a small portion of the workshop (so it was more like a track that included a practical exercise), I was very satisfied with it because I felt like I received a good starter kit that I could use at work right away. Next, I’ll cover some of the useful bits from the workshop and how I have already applied what I learned.
The opportunity to test a redesigned part of the product came soon enough. This part of the application is used for data entry, so the flow and the fluency of the steps are very important. The workflow used to be distributed across different pop-up screens, but now most of the steps were consolidated into one large screen. The flow also had to work for both keyboard and mouse users.
Hegle showed an example of usability testing where the testers’ reactions were recorded along with their comments. In my context, recording facial reactions would probably be overkill, since I used my own team members (two testers and a technical writer) for usability testing. However, I asked them to record their short test sessions along with audio of their comments. Luckily, our product has built-in screen recording functionality, so I didn’t have to use any special software for that.
The recordings were only a couple of minutes long, but they provided a lot of information about the steps that the testers found confusing or misunderstood. This was great first-hand feedback. I could also go back later and compare the recordings of earlier sessions with those recorded after improvements had been made.
Usability Characteristics and Tasks
For me the usability characteristics (learnability, efficiency, memorability, errors, satisfaction) gave a good way to focus the usability testing. Since this part of the product is concerned with financial information, reducing user errors, making it easy to learn, and efficient to use were the most important characteristics to focus on. Knowing what is important for the target audience helps to figure out which characteristics matter the most in your context.
As Hegle had suggested, it is beneficial to have the usability testers perform specific tasks that each focus on a particular aspect. Each tester was already familiar with the older version of the flow. In the case of medical billing software, it hardly makes sense to pull someone in off the street, so previous knowledge of and experience with the workflow was a prerequisite for choosing the testers. I had three general methods for going through the workflow: using only the mouse, using only the keyboard, and using both. I combined these methods with two types of product-specific flows (the details differed slightly), so I ended up with six tasks. I hoped these tasks would highlight the problem areas best.
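The way the six tasks come together can be sketched as a simple cross product of input methods and flows. This is just an illustrative sketch; the flow names below are placeholders, not the actual flows from the product:

```python
from itertools import product

# Three general methods for going through the workflow
methods = ["mouse only", "keyboard only", "mouse and keyboard"]

# Two product-specific flows (hypothetical names for illustration)
flows = ["flow A", "flow B"]

# Every method/flow combination becomes one usability-testing task
tasks = [f"{flow} using {method}" for method, flow in product(methods, flows)]

print(len(tasks))  # 3 methods x 2 flows = 6 tasks
for task in tasks:
    print(task)
```

Writing the task list out this way also makes it easy to see at a glance whether any combination was skipped in a testing round.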
The results were great. We got a bunch of suggestions for changing the layout, the order of steps, how focus moves through fields, navigation, field descriptions, etc. We also found bugs that would have affected keyboard users or mouse users separately. Some development tasks came out of this, too, as we found that we needed to change some aspects of the application’s behavior.
Another benefit of using “internal customers” for usability testing was that they could compare the patterns and behaviors they had learned against the redesigned part of the product.
Once we had smoothed out the new workflow and screens, we presented them to some customers for additional feedback. They were able to contribute ideas based on their business context. We made those changes and went through another round of usability testing, this time also testing some specific areas independent of the original tasks.
All in all, we got very positive feedback on our work. We worked closely with the CEO on this one, and he brought the project up in meetings and cast it in a positive light. So not only were we able to help make the product better, but we also garnered positive attention for (usability) testing.
I hope to use this success as an argument for doing more for usability in the future.
Link to Hegle’s slides (I didn’t want to retell everything): http://nordictestingdays.eu/2012/uploads/Presentations/Practical%20guide%20to%20usability%20testing%20-%20Hegle%20Sarapuu.pdf