Chrome iOS App Integrations @ Google
As a graduate UX Research Intern at Google on the Chrome browser team, I led this evaluative usability study over 3 weeks in August 2022. I collaborated with other researchers, product managers, and designers.
Note: Insights and deliverables are confidential, but please reach out if you have any questions.
What if you didn’t have to switch between apps when looking up directions, translating text, or creating calendar events?
On the Chrome for iOS team, we wanted to make this a reality. I led research to evaluate the discoverability and usability of three entry point concepts for a new feature that would allow the Chrome for iOS app to integrate with other Google apps like Maps, Translate, and Calendar.
This study focused on integration with Calendar, allowing users to seamlessly create calendar events without leaving the Chrome app.
Stakeholders needed to decide which of the three entry points to proceed with and whether we needed to create a user education pop-up to show users how to use the feature.
See Figure 1 for a quick summary of this study and keep reading to learn about my process!
Figure 1. Project summary created in Figma
Methods
I ran an unmoderated remote usability test with 6 conditions and 31 participants to evaluate each entry point with and without user education.
I met with stakeholders to understand what they needed to learn from this study and translated their needs into three research questions:
Which entry point (A, B, or C) is the most discoverable and usable?
Does user education improve the discoverability of the entry point?
Which entry point (A, B, or C) do participants prefer?
I then designed a usability test to answer these questions. I opted for an unmoderated approach to see whether participants could find and use the entry points without any guidance or intervention from a moderator. This method also enabled me to gather more data while working within a tight three-week deadline and limited overlapping working hours with the designer in France.
I implemented a 2×3 factorial design to test each entry point with and without a user education pop-up (see Table 1) and worked with the designer to create prototypes for each condition. I recruited 31 participants, approximately 5 per condition, through UserTesting, using screener questions that targeted recent Chrome for iOS users.
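To make the factorial design concrete: the six conditions are simply the cross product of the three entry points and the two user education states. A minimal sketch in Python — the labels here are assumptions for illustration, since the actual Table 1 is confidential:

```python
from itertools import product

# Hypothetical labels -- the real Table 1 is confidential.
entry_points = ["A", "B", "C"]
user_education = [True, False]

# 2x3 factorial design: every entry point crossed with education on/off.
conditions = [
    {"entry_point": ep, "user_education": edu}
    for ep, edu in product(entry_points, user_education)
]

print(len(conditions))  # 6 conditions, ~5 participants each
```

Crossing the factors this way is what lets the same data answer both questions at once: entry point differences and the effect of user education.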
Analysis
I compared quantitative and qualitative data across conditions to determine which entry point was the easiest and most enjoyable to use. I also compared each entry point with and without user education to determine whether the user education pop-up improved discoverability and usability.
To determine which entry point was the most discoverable and usable, I compared the following metrics across conditions:
task completion rates
average time it took participants to find the entry point
median single ease question (SEQ) scores
frequency of alternate entry point attempts
A more usable entry point had a higher task completion rate, a quicker discovery time, a higher SEQ score, and fewer alternate attempts. I synthesized these quantitative metrics with qualitative themes from a think-aloud protocol to determine which entry point was the easiest and most enjoyable to use.
To see whether we needed a user education pop-up, I compared task completion rates with and without the pop-up for each entry point. I also asked participants to rate how helpful the pop-up was on a 7-point Likert scale; I considered the pop-up to improve entry point discoverability if task completion rates were higher with it and participants rated it as helpful. Again, I synthesized the quantitative metrics with qualitative themes from participants' reactions to the pop-up, and I included a manipulation check to determine whether participants consciously noticed it.
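As a hypothetical illustration of how the four metrics above roll up per condition, here is a minimal sketch in Python — the session data and field names are invented for illustration, not the study's actual (confidential) results:

```python
from statistics import mean, median

# Hypothetical session data for ONE condition, as it might come out of an
# unmoderated UserTesting study. Each tuple is one participant:
# (completed_task, seconds_to_find_entry_point, seq_score, alternate_attempts)
sessions = [
    (True, 14.2, 6, 0),
    (True, 21.8, 5, 1),
    (False, None, 2, 3),   # never found the entry point
    (True, 9.5, 7, 0),
    (True, 30.1, 6, 2),
]

def summarize(sessions):
    """Roll one condition's sessions up into the four comparison metrics."""
    completed = [s for s in sessions if s[0]]
    return {
        "completion_rate": len(completed) / len(sessions),
        # Discovery time is only defined for participants who found it.
        "avg_discovery_time_s": mean(s[1] for s in completed),
        "median_seq": median(s[2] for s in sessions),
        "alternate_attempts": sum(s[3] for s in sessions),
    }

print(summarize(sessions))
```

Computing the same summary for each of the six conditions makes the cross-condition comparison described above a straightforward table lookup.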
Findings &amp; Impact
One entry point stood out as the most discoverable and usable, but user education was not helpful. We iterated on this entry point to arrive at the released design.
One entry point was clearly the winner: it had the highest task completion rate and the lowest discovery time, and participants appreciated its ease of use. Surprisingly, though, user education backfired. It confused participants and did not increase the discoverability of any entry point because participants often mistook the user education pop-up for the entry point itself. I recommended that we proceed with the most discoverable entry point without user education. This solution would be the most intuitive for our users and would save the design and engineering teams time and resources by avoiding an unnecessary user education pop-up.
I began socializing findings during the study through daily updates and preliminary insight sneak peeks sent to designers, product managers, and other researchers via Slack and email. The final deliverable was a report deck and presentation. I included video clips and direct quotes in the report to give stakeholders more context and let them see directly how participants interacted with each entry point.
Ultimately, the design team implemented my recommendations; they were able to move forward with one entry point and did not spend extra time creating a user education pop-up. The released feature is an iteration of the concepts tested in this study (see Figure 2). It allows users to press and hold on addresses, dates, and text to seamlessly get directions, create calendar events, or translate.
Figure 2. Chrome for iOS integration with Calendar as of July 2023
Reflection
If I had more time, I would have liked to evaluate the entry point in other scenarios and run statistical tests to more definitively determine which entry point was the most discoverable.
One limitation of this study was the testing environment: some participants mentioned that they might not have found the entry points if they weren't really looking for them. Ideally, I would like to do more research to validate that this entry point would still perform well in other scenarios and for different applications of the new feature (e.g., Chrome integration with Maps and Translate). While my internship ended after this study, I designed the research plan so that the team could reuse this same test to evaluate other entry points and/or future applications of the new feature.
There was significant time pressure as the end of my internship approached. If I were to do this study again, I would have liked to run a two-way ANOVA to test whether the average time it took participants to discover each entry point varied with the entry point and the presence of user education, or a regression to determine whether those two factors predicted discovery time. Ultimately, that level of rigor was not needed for this study: it was clear from the data that one entry point stood out as the most discoverable, and that was enough to iterate on the current design and lay the foundation for future design updates and testing.
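The two-way ANOVA mentioned above is mechanical for a balanced design like this one. A minimal stdlib sketch on invented discovery times (not the study's data); it reports F statistics only, since computing p-values would require an F-distribution CDF from a library like SciPy:

```python
from statistics import mean

# Hypothetical, balanced data: discovery time in seconds, keyed by
# (entry_point, user_education), n = 2 participants per cell.
times = {
    ("A", "edu"): [12.0, 15.0], ("A", "none"): [10.0, 11.0],
    ("B", "edu"): [25.0, 28.0], ("B", "none"): [22.0, 24.0],
    ("C", "edu"): [18.0, 20.0], ("C", "none"): [16.0, 19.0],
}

def two_way_anova(cells):
    """F statistics for a balanced two-factor design with interaction."""
    a_levels = sorted({a for a, _ in cells})
    b_levels = sorted({b for _, b in cells})
    n = len(next(iter(cells.values())))          # per-cell sample size
    grand = mean(y for ys in cells.values() for y in ys)
    a_mean = {a: mean(y for b in b_levels for y in cells[(a, b)]) for a in a_levels}
    b_mean = {b: mean(y for a in a_levels for y in cells[(a, b)]) for b in b_levels}
    cell_mean = {k: mean(ys) for k, ys in cells.items()}
    # Sums of squares for main effects, interaction, and error.
    ss_a = len(b_levels) * n * sum((a_mean[a] - grand) ** 2 for a in a_levels)
    ss_b = len(a_levels) * n * sum((b_mean[b] - grand) ** 2 for b in b_levels)
    ss_ab = n * sum(
        (cell_mean[(a, b)] - a_mean[a] - b_mean[b] + grand) ** 2
        for a in a_levels for b in b_levels
    )
    ss_error = sum((y - cell_mean[k]) ** 2 for k, ys in cells.items() for y in ys)
    df_a, df_b = len(a_levels) - 1, len(b_levels) - 1
    df_error = len(a_levels) * len(b_levels) * (n - 1)
    ms_error = ss_error / df_error
    return {
        "F_entry_point": (ss_a / df_a) / ms_error,
        "F_user_education": (ss_b / df_b) / ms_error,
        "F_interaction": (ss_ab / (df_a * df_b)) / ms_error,
    }

print(two_way_anova(times))
```

Each F statistic would then be compared against the critical value for its degrees of freedom; a large F for the entry point factor, as in this toy data, would mirror the study's informal conclusion that one entry point stood out.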