Case Study

Concept Test

Code for America

Contract, Lead UX Writer

Summary

I designed a survey to evaluate users’ perceptions of our revised navigation labels.

After a tree test exposed problems with our website's information architecture (IA), I worked with the research team to make our follow-up survey more straightforward and accessible. We wanted to find better-suited terms for our navigation labels and to understand the reasoning behind participants' preferences, so I revised the survey for clarity and simplicity. The results were decisive: participants strongly preferred my alternative labels. Based on this feedback, we updated the IA to match user expectations, improving the website's navigability, SEO, user comprehension, and inclusivity.

Situation

Because the tree test revealed issues with our information architecture (IA), I collaborated with the research team to refine our survey methodology for a follow-up study. We aimed to make the survey as accessible and understandable as possible for a diverse participant group. This effort was part of an initiative to improve the website's usability by aligning its navigation labels with users' expectations and natural language.

Task

My goals were to identify and validate better-suited terms for concept labels within the IA and to explore whether participants had more appropriate suggestions than those we had proposed. We sought to gather context and reasoning behind participant choices to enrich our quantitative data with qualitative insights.

Actions

  1. I worked with the research team to revise the survey wording and design, ensuring universality, clarity, rigor, and simplicity in questions and response options.

  2. I included an "other" option for each question to accommodate responses that might not fit our predefined options, allowing for a broader range of feedback.

  3. I added an optional free-response box inviting participants to elaborate on their choices, aiming to capture the reasoning behind their selections and gain deeper insights into their preferences.

Results

The survey revealed a clear preference for the alternative navigation labels I had proposed (shown in orange on the charts below): 0% of participants chose "Research Papers" or "Data Novices," the terms previously used in the IA. This confirmed that my proposed labels matched participants' mental models and everyday language.

Outcomes

Based on these findings, we updated the website's IA to incorporate the new labels, significantly improving its alignment with users' expectations. This update made the architecture:

  1. More aligned with users' mental models, enhancing navigability.

  2. Better optimized for search engines (SEO).

  3. Easier for users to understand, increasing trust and retention.

  4. More inclusive, extending the website's appeal to a wider range of potential users.

The successful incorporation of participant feedback into the IA demonstrated the value of involving users directly in the development process, leading to a more user-friendly and effective website.

Previous

Code for America: Tree Test Data Analysis

Next

Code for America: Website Usability Test