My Lists @ Disney’s Movies Anywhere

As the UX Researcher on the Movies Anywhere team at Disney, I led a two-part study, a contextual inquiry followed by an unmoderated usability test, in May and June 2021 to evaluate the first-time user experience of My Lists.

Note: Insights and deliverables are confidential, but please reach out if you have any questions.

How can we ensure the release of a new feature, My Lists, is successful?

Movies Anywhere is a platform that enables movie fans to purchase, store, and watch movies from various retailers and sync their movie collections across devices. This study evaluated the first-time user experience of a new feature called My Lists, which allows users to organize their movie collection with custom lists and automatically generated lists based on genres, franchises, people, and themes (see Figure 1).

Figure 1. My Lists landing page

I worked with the product, design, and marketing teams to define the goals of the study. The marketing team wanted to know whether the marketing messaging was effective in creating interest in My Lists, while the product and design teams wanted to know whether there were any major pain points to fix before the feature's release, which was a month away.

I first investigated users' reactions to marketing communication, their impressions of the feature in their personal Movies Anywhere accounts, and the overall usability of the feature. In a subsequent usability test, I evaluated the impact of design updates based on my recommendations from the contextual inquiry.

Part 1: Contextual Inquiry

Methods

In a remote study, I observed 5 participants as they explored marketing messaging and My Lists in their personal accounts.

I chose a modified contextual inquiry so that I could observe how participants interacted with My Lists without prompting and in a slightly more natural context. Participants shared their screens while they explored marketing messaging and My Lists in their personal accounts. A product manager observed the sessions and turned on the feature in participants' accounts for the duration of each session, which allowed participants to make and view lists with their own movies and give feedback on the auto-generated lists. I then followed up with questions about what they did and asked them to interact with any features within My Lists that they had not yet discovered.

Five participants were recruited through UserTesting. I screened for current Movies Anywhere users with active accounts containing at least 20 movies, to ensure that participants had enough movies to make lists with.

Analysis

I identified areas where participants struggled and prioritized pain points based on severity and frequency.

To begin the analysis of the contextual inquiry, I determined which features participants interacted with on their own and which they required prompting to find, which helped us understand how discoverable each feature was. I also watched participants' interactions with each feature within My Lists, noting where they struggled and where they succeeded. I then took an inductive approach to thematic analysis to uncover patterns in participants' reactions to marketing messaging and impressions of the feature. Finally, I prioritized pain points based on frequency and severity.
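The prioritization step boils down to simple arithmetic: each pain point is scored by how many participants hit it and how severe it was. A minimal sketch in Python, where the pain points, severity ratings, and scoring formula are all hypothetical illustrations, not the actual (confidential) study data:

```python
# Hypothetical prioritization of usability pain points by
# frequency (how many of the 5 participants hit the issue) and
# severity (1 = cosmetic ... 4 = blocker). Illustrative only.

pain_points = [
    # (description, participants_affected, severity)
    ("Icon label did not match mental model", 4, 3),
    ("Could not find list-editing workflow", 3, 4),
    ("Auto-generated list names unclear", 2, 2),
]

TOTAL_PARTICIPANTS = 5

def priority(affected, severity, total=TOTAL_PARTICIPANTS):
    """Score = frequency (0..1) x severity (1..4)."""
    return (affected / total) * severity

# Rank pain points from highest to lowest priority score
ranked = sorted(pain_points, key=lambda p: priority(p[1], p[2]), reverse=True)
for desc, affected, severity in ranked:
    print(f"{priority(affected, severity):.2f}  {desc}")
```

A weighted score like this is just one way to operationalize a severity/frequency matrix; the point is that the ranking, not the raw counts, drives which fixes get recommended first.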

Findings & Impact

Marketing messaging generated interest in My Lists, but there were usability issues, such as confusing button names and icons. I recommended changes that were then validated in a follow-up usability test.

Themes from the contextual inquiry indicated that marketing messaging was effective in creating interest in My Lists. I also uncovered usability pain points in the places where participants commonly struggled to use My Lists. For example, some button names and icons confused participants because they did not match their mental models, and participants also struggled to edit their lists.

I presented findings and actionable recommendations to designers and product managers. I proposed simple changes to address the pain points (e.g. changing icons and renaming buttons to be more intuitive) that could be made quickly but still have a positive impact, as well as larger changes (e.g. redesigning the workflow for editing lists) that would require more investment but improve usability more substantially. Ultimately, we decided to implement the simple changes because we had limited time before the feature's release.

Part 2: Usability Testing

Methods

I conducted an unmoderated usability test with 5 Movies Anywhere users to validate design updates aimed at making button names and icons more intuitive.

After the first part of the study was completed and the design changes were made, I followed up with an unmoderated usability test with 5 Movies Anywhere users (different participants from Part 1) to validate that the design updates improved usability. I worked with a designer to create a prototype featuring the updates. The test focused on evaluating the intuitiveness of the icons and button names that had been changed based on findings from Part 1.

Analysis

I evaluated task completion rates and user feedback to determine whether the design updates improved usability.

I organized the raw data into a spreadsheet and determined the task completion rate for each task. I also identified themes that emerged during the think-aloud protocol. Design updates were deemed usable if the task completion rate was high and participants did not express confusion during the task.
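Computing a per-task completion rate from unmoderated session data is straightforward: each task maps to one pass/fail outcome per participant, and the rate is the fraction of passes. A small sketch in Python, with invented task names and outcomes rather than the study's actual (confidential) results:

```python
# Hypothetical per-task completion rates from an unmoderated
# usability test with 5 participants. One pass/fail outcome
# per participant per task. Values are illustrative only.

results = {
    "Create a custom list":        [True, True, True, True, True],
    "Rename an existing list":     [True, True, False, True, True],
    "Open an auto-generated list": [True, True, True, True, False],
}

def completion_rate(outcomes):
    """Fraction of participants who completed the task."""
    return sum(outcomes) / len(outcomes)

for task, outcomes in results.items():
    print(f"{completion_rate(outcomes):.0%}  {task}")
```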

Findings & Impact

Design updates were successful in creating a more intuitive experience.

This test quickly validated that the simple changes to icons and button names made it easier for participants to use My Lists. These updates were made in the final product (see Figure 2) before it was released.

Figure 2. Live state of My Lists as of September 2021

Reflection

If I were to do this again, I'd get involved earlier to proactively address usability issues, and I'd benchmark metrics before and after design updates.

By the time I joined the team and took on this project, My Lists was already in development. Ideally, I would have conducted usability testing much earlier. On later projects, I got involved as the UX researcher earlier in the process to identify and fix pain points well before release.

If I were to run this study again, I would also measure task completion rates for key use cases before and after the design updates in Part 2, so that we could benchmark the new designs against the old ones and more definitively determine whether the updates improved usability.
