AMAs
September 28, 2023

Andy Warr on guaranteeing research impact

Andy Warr, Research and Insights leader, joined the Rally team on September 28, to discuss guaranteeing research impact. Andy covered topics like strategies for guaranteeing research, communicating research impact and insights, balancing competing priorities, and more. 

If you missed it, or want to revisit the highlights, read our recap below. If you’d like to watch a recording of the full AMA, follow this link.

Key takeaways 

🔑 Research is vital for informed decision-making: the basic unit of impact is enabling a decision to be made, and the ultimate goal is shaping a comprehensive strategy.

🔑 Embracing adaptability in research practices and taking initiative, even beyond traditional roles, is essential. Frequent iteration and taking early actions can lead to impactful outcomes.

🔑 Measuring UX Research impact remains a challenging domain. Key strategies involve identifying decisions to be made and documenting results of research on those decisions.

🔑 Competing priorities require a structured approach. Roadmap planning involves working with cross-functional partners to draft a prioritization list. Clear boundaries should be set on what can be executed, and unresolved discrepancies may need escalation to higher management.

🔑 Evaluating the value of research work can focus on identifying new opportunities, changing direction, stopping certain actions, or increasing confidence in decisions.

🔑 To create an impact amidst changing goals, researchers should provide data-driven, customer-centric decisions and focus on future building, not just the immediate quarter.

Who is Andy?

Andy is currently the Director of Portfolio Research and Insights at Dropbox, a team that includes customer and market research, as well as Research Ops. 

Andy’s career is layered with impactful research. Here’s a glimpse into some of the impact he and his teams have had:

  • Microsoft: Spearheaded a pioneering effort to automate the process of provisioning remote usability testing at a time when such concepts and tools didn’t exist.
  • Google: Played a pivotal role in launching and patenting a tab and window manager for Chrome OS, which was empirically shown to improve the efficiency of window management over competitors.
  • Instagram: Led a research team to understand the needs of small to medium businesses and their use of social media. This initiative led to the creation of the widely utilized business profiles and Instagram insights, with 25 million businesses and over 200 million users interacting daily by 2017.
  • Uber: Engaged with Uber Eats leadership in a deep dive into the world's largest food delivery market in China. These insights led to an Uber app redesign, which increased annual recurring revenue by a projected $66 million.
  • Airtable: Ran a cross-functional initiative to realign the company’s focus towards marketing and product operations, leveraging a blend of qualitative and quantitative research.

For Andy’s AMA, we tried out a new format and asked Andy to share a brief presentation before diving into audience questions. You can access the slides here and enjoy the recap of his presentation. 

Andy’s presentation delved into the critical topic of research impact, particularly focusing on its effect on products within a company. The discussion was structured around how research can significantly influence decisions within an organization and, in turn, have a substantial impact. Before Andy began his presentation, he shared some important context. 

“First, I would like to acknowledge that I have had a few successes throughout my career, but I’ve also had my fair share of failures. I’ve worked hard, and I’d like to say I’ve worked smart to have these successes, but there is also an element of luck and I don’t want to discount that. Finally, none of these successes would have been possible without the people that I got to work with on each of the projects. Now let’s talk about research impact. 

“Defining and measuring research impact has been like the search for the holy grail of our discipline. Before I dive in, I want to start with a caveat: I don’t want to pretend to have found this holy grail. Though I don’t have THE answer, I do have opinions and my opinions are based on my experiences. This also includes conversations I’ve been having with industry leaders from companies like Airbnb, Booking.com, and Uber, on how they think about measuring research impact. Additionally, my opinions are going to evolve based on new experiences and even the questions asked today.

“Lastly, all my work experience in my post-academic career has been in-house and at big and emerging tech companies. I wanted to provide that context for how I think about impact. It might differ for your industry or the particular field in which you’re working. My goal today is to tell you my current thinking as well as strategies me and my teams have adopted to maximize our impact.”

Defining research impact

Andy introduced the idea of the "three P's of impact":

  • Product
  • Process
  • People

For this presentation, the spotlight was on the impact on product.

Andy touched upon a framework from Karin den Bouwmeester, which broke down impact into business, organizational, and UX research outcomes. He suggested not to aim for achieving all these impact measures simultaneously but to prioritize based on the team and organizational needs at a specific time, as well as to consider the behaviors each metric incentivizes.

Translating research into influence

Andy went on to share his view of research impact as it relates to influencing decisions. He emphasized:

  • Research is a systematic investigation aimed at significantly affecting individuals (i.e., stakeholders) making business, design, or product decisions.
  • Decisions are conclusions reached post-consideration, and this consideration is powered by systematic investigation, i.e., research.
  • The core thesis is, “we conduct research to make decisions,” where a decision could also mean opting not to take action. 
  • The atomic unit of research impact is allowing a decision to be made. Enabling an interrelated and powerful set of decisions – a strategy – is strategic research impact.

Ensuring research impact

Andy shared seven strategies to ensure research impact, some of which were derived from an article he wrote earlier in the year:

1. Associate work with an objective and key result (OKR): Andy stressed the importance of aligning all research with organizational goals, whether through OKRs, KPIs, or other frameworks. “OKRs are a goal-setting framework used by individuals, teams, and organizations to measure goals and track the outcomes,” he said. “This will help reveal and share the purpose around research and ensures the work you and your team are doing is relevant.”

2. State the desired impact: When planning a research study, it's essential to outline the desired impact the study should have, specifically on business or product decisions. “It’s almost like a self-realizing impact and is a way to set you up for success.”

3. Identify a decision DRI (directly responsible individual): Defining who is responsible for making the specified decision is crucial to hold people accountable. "My one push here is there can only be one person who is responsible for making this decision,” said Andy. “As soon as you have more than one, then you may encounter people wanting to pass the buck for decision making.” 

4. Create an analysis plan: An analysis plan outlines how the collected data will be analyzed and mapped to the decision to be made. It ensures that all collected data is effectively utilized. He showcased this template as an example of structuring an analysis plan. He also shared a survey design template that can be utilized to build a screener survey for your study.

5. Make actionable recommendations: Pair each insight with an actionable recommendation that guides stakeholders toward what should be done. “Cross-functional stakeholders are not merely interested in learning what we’ve learned from our users; they want to know what action they should be taking.” Be opinionated in those recommendations: “Stakeholders may be too busy to digest all the material, and you can translate that material into an action because when you give a recommendation, you are doing so as a subject matter expert.” 

6. Outline the what, so what, and now what: Every research report should include a TL;DR or executive summary that guides the reader to the relevant content. The TL;DR should highlight the impact of the research and the decisions that were made (the “so what”), as well as the status of next steps (the “now what”). “It’s important to mention that the ‘now what’ recommendations don’t need to be only the researcher’s responsibility to determine; these recommendations can and should be workshopped with stakeholders.” 

7. Take action: Andy encouraged researchers to take proactive action, sometimes extending beyond traditional roles, to drive impact within the organization. "Some of the best researchers I've worked with are the ones who can put on the hat of a designer, an engineer, or a product manager to create a mock, build a prototype, or even write a PRD when needed." This proactive approach is essential for having an outsized impact on the company.

And finally, Andy reminded everyone of a conclusion he came to after discussions with research leaders. “Why are we and others so focused on defining and measuring the impact of research? Because we’re all working toward the same goal: Building products for our customers.” 

Are there any steps in this process you can skip or should revisit? 

Andy acknowledged that rigidly following the seven-step framework he discussed isn't always feasible, especially when constrained by tight timelines. “I’ve definitely skipped creating an analysis plan when timelines were tight and we just needed to get a survey out.” He emphasized that none of the steps he listed are "mandatory," but are techniques aimed to "optimize alignment," particularly across functional stakeholders.

Adaptability, according to Andy, is key. It's not about religiously following each step but adapting the framework to one’s company dynamics. He suggested perhaps tackling the steps "one at a time, maybe once a quarter or once a month. You just need to determine what works for you.” 

How to take action without stepping on functional leaders’ toes?

“If you see an opportunity and someone’s not taking it or acting on it, and you believe you can, then go do it and see what happens,” said Andy. “I always go back to a saying we had at Google – ask for forgiveness, not permission.” If acting on such an opportunity triggers territorial reactions, Andy said to treat it as a lesson learned. 

Another strategy for advancing research is to “iterate a lot.” Instead of waiting for a project to be complete, Andy recommended gradually building out ideas, socializing them, and gaining momentum from the ground up. This tactic not only invites input but also showcases initiative. “This approach brings people along on the journey with you,” said Andy. Ultimately, it can enable you to make strides without overstepping boundaries with other functional leaders.

What do you do when there aren’t clear OKRs / meaningful KPIs? 

Andy recommended setting up a cadence of meetings with company leaders to understand and document their goals for the upcoming quarter. After these meetings, Andy would look for potential synergy between different groups to focus research efforts accordingly. With the collated insights, Andy aimed at making the "intangible, tangible" by presenting these documented goals back to stakeholders.

How to cultivate relationships with DRIs

When a DRI exits a project or the company amidst a research initiative, Andy highlighted the interdependent nature of such projects. He pointed out that research is "rarely conducted in isolation from other cross-functional partners." Should a DRI exit, typically, either someone else steps into that role, or the project may be deprioritized, warranting a reevaluation of the ongoing research endeavors.

When it comes to building relationships with partners, Andy shared that he frequently gravitates toward partners in product management. “In most of the projects I’ve worked on, my closest relationship has always been with a product manager.” More specifically, Andy recommends seeking out product managers in leadership roles, as they tend to be the ones setting the strategy and objectives. By cultivating relationships with people who play a role in goal-setting for the company, you make a deliberate move to align research efforts with strategic objectives. 

How to trace research impact amid shifting company priorities

Andy outlined a methodical approach to tracking research impact even when company priorities shift mid-research. The first step is to clearly define the decisions to be made at the research outset. It’s important here, Andy said, to document those decisions you’ve made. 

When it comes to shifting priorities, Andy shared an approach he adopted both at Airtable and Dropbox: Focusing not just on the current quarter but planning for the subsequent ones. “When you’re building product, things do change super quickly. And research is not necessarily the quickest practice to be executing. It can take time and rigor.” This could look like conducting research in Q4 that sets you up for success in Q1. “This is product development,” he said. “Trying to think more ahead and get ahead of the curve, as opposed to being within the churn.”

Tracing impact as a research team of 1

“Until my current role, for every team I’ve built up I’ve always been the first researcher and built teams around me,” said Andy. Andy’s current role at Dropbox is the first time he’s led an existing team with existing practices. “It’s definitely a humbling lesson for me in terms of curiosity and learning.”

As a team of one, Andy said you have the opportunity to create processes tailored to your organization. He also advised focusing on areas of passion where you can make a significant impact and then "socializing that with your leaders to get their thoughts and how it maps to the goals of the organization and their values." By engaging in a cycle of piloting, learning, and iterating, researchers can evolve processes to encourage customer-centric and data-driven decision-making. This, in turn, could lead to buy-in for expanding the team and research scope. 

Sharing out research insights and impact 

At Dropbox, once a study is completed, Andy said the findings are primarily shared with the involved product team and further disseminated through a research-focused Slack channel named “Company Announce,” which is open to anyone interested in keeping up with research efforts. Conversely, Airtable opted for a more inclusive approach by enrolling everyone in the company to the relevant channel, aiming to foster a customer-centric and customer-first culture. “People need to be exposed to those insights if we truly believe in building products with insights first.” 

Moreover, Andy mentioned the use of research readouts at Airtable, a practice they are hoping to reintroduce at Dropbox. Additionally, a monthly newsletter called “The Customer Snapshot” is circulated across various parts of the company. This newsletter not only highlights key takeaways from the month's research but also includes insights generated by other departments (an example of this being a CSAT report from the Customer Experience team or a piece of research from the Marketing team). “We’re really trying to push the boundaries and provide a representation of customer insights.” 

Collaborating across insights disciplines

At Airtable, Andy and his team envisioned their research function as a center of excellence for customer research, aiming to centralize insight gathering within his team while still maintaining a collaborative approach with other customer-facing teams. For instance, they often reached out to Customer Success Managers to understand their knowledge about a problem space, synthesizing this as secondary research to inform new research efforts and triangulate data. This collaborative approach extended to cross-functional projects, like one where they joined forces with the data science team to analyze customer use cases, engagement, and product usage. “This really helped with prioritization and understanding of the investments we wanted to make.”

Dropbox, on the other hand, has instituted an Insights Council, which comprises representatives from various insight-generating functions within the organization, like Customer Experience, Research, Data Science, and Marketing. The council aims to create more awareness of ongoing projects and identify collaboration opportunities, aligning on goals for the quarter or fiscal year, and identifying any existing gaps. Although this collaborative approach is still a learning process, it’s seen as a strategy to enhance the overall impact and coherence of research and insights across the organization. In all collaborative efforts, Andy recommends being purposeful from the beginning. 

Balancing competing priorities in UX Research

Andy shared his recent experience at Dropbox during Q4 planning, where he faced the challenge of aligning his team's OKRs amidst competing narratives and initiatives. One of the exercises he went through was identifying synergies across various teams to form a focused OKR. “Now, it can be tempting to put every initiative that you see there,” said Andy. “I believe that’s a mistake.” He likened doing this to “shotgun referencing,” a term he encountered during his PhD. Don’t try to take all the wood and put it on the fire, he continued. “Really maintain your focus. Try not to dilute the impact of your research by trying to be something to everyone,” he said. “Make meaningful investments.” Ultimately, this will increase the scope of your impact.

Andy also shared a structured approach for when competing priorities arise. “Unfortunately, research tends to be a pretty small function within companies and because of this we can’t do everything.” As a result, there need to be honest discussions about what should be prioritized. During roadmap planning, his teams work with cross-functional partners to draft a prioritization list based on potential impact. They then set a clear boundary on what can realistically be executed within the timeframe, marking a “line in the sand.” If a consensus isn’t reached at this stage, the matter is escalated to higher management for a final determination. Andy noted holding reviews with functional leaders to resolve any discrepancies on priority tasks, ensuring alignment and focused execution on the initiatives that hold the most value and impact for the organization.

Adapting research pace to stakeholder bandwidth

Andy shared a scenario at Airtable where the pace at which his team was generating research insights was ahead of the stakeholders' capacity to act upon them. His CPO shared that the foundational insights already provided had sufficiently outlined the product roadmap for the upcoming 12 to 18 months and emphasized a need to shift focus towards execution. 

The Research team then transitioned to work more closely with the product teams on the current build-out, aligning their efforts with the immediate needs of the organization. “Organizations are going to have different needs for research in different parts of the organization and at different times of the company lifecycle.” This requires your team to be agile and flexible, said Andy, so you can adapt to what the company needs and values at any given time. 

Rapid fire Q&A

Below are a few pre-submitted and live questions that Andy was unable to cover during the AMA that he's answered async:

How can individuals articulate the impact of their work during job interviews and in career materials?

First, I, like many companies, prefer candidates to use the STAR – Situation, Task, Action, Result – framework. Describe the context, what you were asked to achieve, what you did to achieve it, and the impact. That impact is the decision made and, more specifically, the result of the research. Conducting a research study while revenue increases by $2B sounds impressive, but was it the result of the research? What did the research inform that enabled that revenue gain to happen? Finally, ensure there is a through-line from the context to the impact.

How should researchers evaluate the value created by their own work?

There are four expectations/hopes I have for research:

  • Identify a new investment/opportunity
  • Change direction
  • Stop something from happening
  • Derisk a situation (i.e., increase confidence)

Here is what I don’t want to happen. I don’t want anyone to say your research was “good to know” or “interesting.” A piece of my soul is sucked out of me every time this happens.

What frameworks have you used to measure and prove UX Research impact?

Measuring the impact of UX research is the holy grail of our discipline – I don’t have the answer. Here is my current thinking though:

  • Identify the decisions to be made
  • Document the decision made as a result of the research

How can researchers create impact amidst frequently shifting goals and priorities?

Why do goals and priorities change? I would argue that a lack of confidence is one reason. Research can help provide confidence by making customer-centric, data-driven decisions. But the focus of our work is also important. We should not focus on what is launching this quarter. We should be focused on what we are building next quarter and beyond. This ensures that we build confidence in the decisions that are within our control.

In light of current layoffs and job uncertainty, what strategies can managers adopt to ensure job security and safety of their teams? 

Ensure that research is associated with a company Objective and Key Result (OKR). Ensure that company leaders are aware of the research being conducted with their organization and its purpose. Ensure that you focus on the craft of research and differentiating it from a customer conversation. Ensure that you are evangelizing research insights and recommendations with company leaders. Ensure that you are recording the impact of your team – however you decide to define it.

How can UXRs deliver an improved user experience post-hire, and make themselves indispensable to the business?

I might sound jaded here, but no one is indispensable. We provide a value – we are valuable – but there are ebbs and flows. The CPO at Airtable once said to me that we had enough foundational research to define our roadmap for the next 12-18 months. They now needed to execute. My team was focused on strategic research, and the company had reached a point where it did not need that. After a RIF, the team focused on more evaluative research, working closely with product. If you want to be indispensable, be agile and flexible, and understand what your company will need now and in the next year.

What does it say about our function if we're constantly feeling the need to prove our impact beyond the work we do?

I like to think that we have a growth mindset. We are considering ways to maximize and optimize our impact. We are also still a relatively young discipline and we still need to educate our stakeholders. That said, looking back a few decades, we have come a long way!

How do you keep your team connected to the "Why" / OKRs?

All research projects are associated with an OKR. Furthermore, each researcher has their own OKRs, which ladder into our company and team OKRs. This ensures alignment and ultimately that our work has purpose.

How do you evaluate new research teams/orgs when starting?

I start by conducting a listening tour. This is an opportunity for me to learn. I identify common themes and then start working with people to identify potential solutions. While I have opinions about what an effective solution might be, I try to enable my team to create the solution.

How/how often do you report/share from your impact tracking? And to whom?

My team reviews our OKRs every month and we grade our OKRs every quarter. I report out to my manager every quarter and he reports his progress, which includes my team’s work, to the executive team.

If you are conducting generative research where "you don't know what you don't know," and you don't have clear decisions to make, how do you define the impact as in your example?

This type of research is like searching for a needle in a haystack. Don’t be surprised if you don’t find the needle.

Now at the same time I don’t want to discount unexpected insights that emerge from research. These are the cherry on the cake. They should not be the impetus for a research study though.

Connect with Andy

Thank you, Andy!

We’re grateful to Andy for joining us and sharing his insight, experience, and passion for ReOps. If you’d like to watch the full webinar, follow this link.