Usability Evaluations

at The Polis Center

Conducted usability testing for four third-party web information systems and data dashboards. Prioritized and implemented design recommendations within agile usability testing lifecycles.


I conducted usability evaluations of hi-fidelity prototypes and live websites for four different web-based information systems and data dashboards developed by The Polis Center. Since these are third-party websites, I cannot show the complete process artifacts, but below I describe my process and key takeaways from them.

As the sole UX researcher on the team, I had a great opportunity to push for process changes and to implement new ways of conducting usability evaluations.

Team

Me - UX Researcher and Designer
Matt Nowlin - Supervisor - Information Designer
Robert Ferrel - Sr. Programmer

Deliverables

Surveys, contextual interviews, and focus group meetings; wireframes; hi-fidelity prototypes; usability testing plan; usability testing protocol; survey questionnaire; usability testing report; improvement recommendations and their implementation

My Role

1. Planning and Script Writing

As the first and sole UX team member, I strove to introduce and improve methods of conducting user testing.

I improved existing surveys by making them task-based and adding visual cues to survey questions. The resulting surveys were easy to respond to yet captured comprehensive feature-based feedback, and they were highly appreciated by stakeholders.

I introduced scripts for semi-structured contextual interviews, planning sessions to determine interview goals and questions, and dry runs. These helped immensely in streamlining the process and avoiding deviations during interviews.

2. Conducting User Testing

I planned and coordinated user testing, and I created, improved, and released the usability surveys.

After initial guidance, I independently conducted contextual interviews.

3. Customer Insights and Ideation

Along with my supervisor, I converted user and stakeholder insights and observations into features.

Process

[Process diagram]

Learnings and Key Takeaways

1. The script is for support; improvise according to the user's context and task

This was a major takeaway. Before every usability session, I mentally prepared to modify the script and flow on the go. Every user comes to a data dashboard with a different goal and context of use: a journalist explored at a fast pace to gather material for an article, while another participant was writing a detailed report on a particular neighborhood. Each comes to the tool with different input and output expectations and a different working pace. It is important to adapt the context of usability testing to their regular tasks to capture the right feedback.

2. Iterate Quickly

If the first two of five tests have already surfaced the same problem, consider whether it is an issue others will face too, and if so, start resolving it quickly. I found this helpful in keeping the remaining tests from repeatedly highlighting the same issue when they could be extracting more valuable feedback instead.

3. Involve the team

Collaboration proves very efficient when working in agile, especially collaborating with developers to get quick feedback on implementation feasibility and timelines.

4. Analyze ASAP

Analyze data, notes, and observations immediately after each session, while everything is fresh in mind. Even when there are audio or video recordings, there is contextual information you observed or noted during the session; it is best to analyze, discuss, and document it as soon as possible. Data such as common usability issues, pain points, and environmental factors can generally be observed and noted this way.

5. Prompt for explanation without leading

As a newcomer to the field, I initially felt that awkward silences were bad. Over time I learned that they are valuable: the goal is only to prompt the user to say what they are thinking or doing, not to fill the silence with long questions or talk.
I also learned to word prompts and tasks carefully. Since the script already contains the overall task, I started adding to it a list of prompts I would likely need, based on past sessions. These prepared prompts ensured I did not use leading words.

6. Take notes immediately after the session if conducting tests alone

When conducting tests alone, it is very difficult to moderate and take notes simultaneously. To manage this, I first made my script form-like, with space after each step and question so I could jot notes on it directly during the session. Second, right after the interview, I sat down in the lobby to note the session's details before I could forget them. Once back at my desk, I reviewed and expanded my notes again while the session was still fresh in my mind. This helped me preserve the minutest contextual information and avoid guesswork from replaying audio recordings later.

7. Create or collate your own Heuristics for Data Visualizations

Although standard evaluation heuristics such as Shneiderman's or Nielsen's exist, different data visualizations and dashboards require a custom set of heuristics. I researched visualization-specific heuristics in scholarly articles, discussed possible new ones with my supervisor, and collated them to conduct heuristic evaluations.

8. Quick Critiques

Asking for critiques, even a quick 5-15 minute look, helps identify issues that could snowball into larger problems down the line. It also helps surface possible alternative solutions.

You can connect with me at nea.patil.02@gmail.com
Updated: 2024
