Case Study

MindTap 2.0 Beta

Cengage Learning

Freelance, User Researcher

Summary

I evaluated the beta version of Cengage's 2nd edition MindTap e-learning platform, identifying usability issues through user interviews, surveys, and usability testing. Critical problems included non-intuitive drag-and-drop mechanics, dense content design, and confusing platform organization. Collaborating with Cengage's design and development teams, we simplified the drag-and-drop mechanics and added tutorials, redesigned the content into digestible sections with clear headings, and reorganized the platform for better navigation, including a customizable dashboard. These enhancements significantly boosted user satisfaction, engagement, and learning outcomes.

Situation

I worked with Cengage to evaluate the beta version of the 2nd edition MindTap e-learning platform. The platform aimed to enhance educational experiences through interactive modules; however, feedback from student users revealed usability problems that were inhibiting satisfaction and learning outcomes. A conversation with an instructional designer pointed to issues with the drag-and-drop mechanics, content design, and platform organization.

Tasks

Given the feedback and insights from student users and instructional designers, my job was to dig deeper into the identified issues with the MindTap e-learning platform. Specifically, I needed to assess the usability of key features, the effectiveness of the drag-and-drop mechanics, the content design, and the overall platform organization, then propose actionable improvements.

Actions

  1. User Interviews and Surveys: I interviewed students and instructional designers to gather qualitative data on their experiences with the platform. I also sent surveys to a broader audience to quantify the extent of the usability issues.

  2. Usability Testing: Next, I organized a series of usability testing sessions focusing on the problematic features, especially the drag-and-drop mechanics and content navigation. These sessions were eye-opening, revealing where users struggled the most.

  3. Analyzed Platform Data: I dove into the platform's usage data to understand how these issues affected learning outcomes and user engagement. This analysis helped correlate user feedback with actual behavior on the platform.

  4. Collaborated with the Design and Development Teams: Armed with insights, I worked closely with Cengage’s design and development teams. We brainstormed solutions, keeping user needs at the forefront of our discussions.

Results

The research uncovered that the drag-and-drop mechanics were not as intuitive as intended, often leading to user frustration rather than engagement. The content design was too dense, making it hard for users to digest information effectively. The platform organization also lacked a clear hierarchy, leaving users confused about where to find necessary resources.

Outcomes

  1. Revamped Drag-and-Drop Mechanics: We simplified these mechanics, making them more intuitive and less prone to errors. We also introduced a brief tutorial to orient first-time users.

  2. Redesigned Content Layouts: The content was broken down into smaller, more digestible chunks, with clear headings and summaries to aid comprehension. Interactive elements were strategically placed to reinforce learning without overwhelming the users.

  3. Reorganized Platform Structure: The navigation was overhauled for clarity, introducing a more logical flow that users could easily follow. A customizable dashboard was also introduced, allowing users to pin their most used resources for quick access.

Following these changes, subsequent user feedback and platform analytics indicated a significant improvement in user satisfaction, engagement levels, and overall learning outcomes.

Next

Pearson: MyPsychLab