Techniques for Entrepreneurs to Drive Insights from Primary Research: Advice from a Student VC — Volume 5

248 Builders
13 min read · Dec 24, 2020


“Excuse me please, but I’ve got to sort this puzzle out,” began the late business professor Clayton Christensen as he recounted how he and his colleagues approached morning commuters, milkshakes in hand, as they exited fast-food restaurants. “What job were you trying to do for yourself that caused you to come here and hire that milkshake?”

In what has become a popular anecdote showing the effectiveness of the jobs-to-be-done approach to understanding customer problems (which Steve Shafran explains further in his article), Christensen described how, with the goal of figuring out how to improve a fast-food chain’s milkshake sales, he and his colleagues discovered that when people purchase a milkshake early in the morning, they do so because they are fulfilling the desire…

  1. to have a quick and filling breakfast that is easy to consume in the car and
  2. to keep themselves preoccupied during their commute.

As Christensen framed it, people often choose to “hire” a milkshake to fulfill these jobs because, in comparison to other typical morning-commute breakfast foods like bananas or bagels, it excels at being easy to consume in a vehicle, taking much of the car ride to consume, and keeping its consumer feeling full until lunchtime.

“If you understand how to improve the job,” he summarized, “improving the product becomes just obvious.”

Importantly, to reach that understanding, Christensen et al. first gathered a wealth of information by interviewing consumers and prompting them to reflect on what it was that they were hiring a milkshake for.

Using conversations like these to generate insights into your target market is one of the more important skills an entrepreneur, product manager, or anyone working in product definition can have. In an acclaimed book on the topic, Talking to Humans, author Giff Constable stated in the introduction that,

“The qualitative part of customer discovery is surprisingly hard for most people, partly because talking to strangers can feel intimidating, and partly because our instincts on how to do it are often wrong.”

From my experience with primary customer research and other similar interviewing scenarios, I would also point out the difficulty that comes from balancing context building, sticking to a prepared set of questions and learning objectives, and being the host of an organic conversation in which the interviewee feels comfortable and willing to engage in the introspection you are asking of them.

There exist many resources on the topic — like Talking to Humans — that serve as references for people of all experience levels on how to engage in this kind of research. To complement those resources, this article offers:

  1. Before and during the interview: A compilation of strategies that I have found helpful for leading successful interviews with stakeholders.
  2. In your periphery: A review of common shortcomings that can undermine the integrity of your conclusions.

Before the interview

There is only so much you can do to prepare before you “get out of the building” as Constable puts it — and vigorously encourages — in Talking to Humans. However, the preparation you do beforehand is critical to unlocking the success of later steps in the process. Bill Aulet in his Disciplined Entrepreneurship Workbook, for example, identifies “Lack of Structured Process” as first in a list of the five biggest obstacles to good primary market research.

To condense Aulet’s recommendations (with some adaptation of my own), pre-engagement preparation should include:

  1. Criteria for whom you want to interview and a plan for how to engage with them (i.e., your strategy for how and when to ask them to share some of their time with you),
  2. Questions to guide the conversations, and
  3. Supporting materials, such as a website mockup, a simple product sketch, or a low-fidelity 3D prototype.

You should define the goals of your research and use those objectives to determine your approach. To demonstrate what this can look like, I have included at the end of this article an example of my pre-interview objective definition for a past project of mine. A key takeaway from this experience and others is this: at any given point, you are operating within a subset of all the possible directions you could take with your research, and you arrived at that point through a non-linear process of gathering information, synthesizing it, and updating your mental model of the problem to be solved. Because of this, be explicit about the assumptions you derive from your research findings and use subsequent research to challenge those assumptions.

During the Interview

Here is a compilation of a few techniques I have found to be helpful in making interviews successful:

  • Be overtly curious: Writing down your questions beforehand is sound advice, but chances are you did not spell out in your bullet-point list, “I would love to learn more about x…” Assuming that you do not have to feign curiosity, show your interviewee that you are genuinely excited to learn from them. In other words, stick to the content of your questions but improvise on their delivery, because people will be more willing to open up to you if they believe you are genuinely interested in what they have to say.
  • Do the active part of active listening: “Be an active listener” is blanket advice everyone has heard at some point for having an effective conversation. It is advantageous for you as the inquirer to synthesize in real time what you are being told and to ask situationally specific questions that generate more insights.
    - Implicit in your list of questions should be your intention to follow them up with inquiries probing your interviewee for more information. Some examples of how to word these include: “We noticed that…,” “It seemed like…,” [to follow up on a straight “no” in response to a binary question] “I wonder whether…,” or five-whys questioning (asking “why” five times in a row to identify a root cause).
    - A tactic I have found particularly effective when trying to parse what I have just been told is to share my interpretation of what the interviewee has just described and ask them, “Is this a fair summary of what you just described?” or, “Is this inference that I just made accurate?” This is distinctly different from, “So what you’re saying is…,” because it invites the interviewee to correct misunderstandings.
  • A “magic wand question” alternative: Talking to Humans advises against asking, “If you could wave a magic wand and have this product do whatever you want it to, what would it do?” I agree with Constable’s assessment of this question, and while he offers an alternative that focuses on what problem the interviewee would solve with a magic wand, I will go one step further and recommend a technique I have been taught and will refer to as the Before and After method:
    - Before: Prompt your interviewee to describe the current state of x, where x is related to the problem you are trying to solve. Have them point out pains and points of friction in the current state of x.
    - After: Ask your interviewee to imagine what x would look like if those pain points were removed or mitigated; do not focus on the how, focus on the what.
    - The Before and After method is enhanced when your interviewee can sketch out — either in bullet points or in drawings — the before and after states; it helps for both parties to have a visual reference to the before state of x when talking through how it could be different.
  • Use superlatives: A way to spark some insightful responses from your interviewees is to prompt them with questions that are worded with superlatives — especially when those questions prompt the interviewee to relay a personal story. For example: “Describe an experience that you believe is most representative of why you find x to be challenging.”
  • Find a co-interviewer: If possible, conduct the interview with another person. There are multiple reasons why I believe conducting an interview with a colleague holds advantages over doing it alone:
    - By interviewing with another person, you allow for a division of labor between note taking and driving the conversation. Audio recordings can stand in for live notes in the absence of a note-taker, but having someone take notes captures information otherwise lost to audio recordings, like body language. Additionally, in more in-depth conversations, having paraphrased quotes written down to refer back to when asking a follow-up question is a powerful way to show active listening and prompt introspection from the interviewee.
    - Furthermore, I have yet to co-lead an interview in which my co-interviewer did not contribute a question I would not have thought of myself. Having an extra perspective can bring out even more useful information from a conversation.

In Your Periphery

I argue that the need to engage in primary customer research is driven by the tension between two facts:

  1. No matter our prior expertise, the information we have about the domain we are entering to solve a problem is ultimately inadequate. The way we perceive, understand, and operate within the world is a narrow sliver of all possible ways.
  2. To turn an idea into a successful product, service, or business, the idea must (among many other criteria) solve an acute enough problem shared by a sufficiently large number of people who are willing to part with their hard-earned money to solve it.

If we had access to complete information about a given domain, the problems that people experience, and how they would react to solutions for those problems, there would be no need to engage directly with people. As this is infeasible, we must gather the data ourselves to make decisions that are as informed as possible.

It is also an unavoidable fact that we are not infallible in how we collect and interpret data to reach informed conclusions. Being aware of our limitations can help us recognize where we can take concrete steps to make more sound decisions. Following are a couple of examples of our fallibility that are important to take into consideration when conducting primary research:

Sample size

In his widely cited book on human psychology, Thinking, Fast and Slow, Nobel laureate Daniel Kahneman states that “…people are not adequately sensitive to sample size.” Even those who have not studied statistics share the intuition (which has a mathematical basis in the central limit theorem) that a larger sample size leads to more accurate results. However, this does not mean that we act accordingly or with mathematical rigor, even when we are conscious of it. Kahneman continues:

“The strong bias toward believing that small sample sizes closely resemble the population from which they are drawn is also part of a larger story: we are prone to exaggerate the consistency and coherence of what we see.”

I am not here to tell you that your scrappy primary research should take on the same statistical rigor that, say, a scientific study would. However, understanding our tendency to assign too much significance to findings from small sample sizes is especially relevant to primary market research, where sample sizes are by nature smaller.

In a poignant example, Kahneman describes a $1.7B investment in education by the Gates Foundation that was directed, at least in part, by findings that the most successful schools are, on average, small. “It is easy,” explains Kahneman, “to construct a causal story that explains how smaller schools are able to provide superior education.” However, if one looked further into the data, they would find that the worst-performing schools were also smaller than average. The takeaway is not that the Gates Foundation’s investment was misdirected, but that small sample sizes — schools’ student populations in this example — allow for greater variability in their results.
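To make the sampling-variability point concrete, here is a minimal simulation sketch (in Python, with invented numbers, not data from the study Kahneman cites). Every simulated student draws a score from the same distribution, so any difference between schools is pure noise, yet the smallest schools still crowd both the top and the bottom of the ranking, because the spread of a school’s average score shrinks roughly in proportion to one over the square root of its enrollment.

```python
# Minimal sketch (hypothetical numbers): small schools dominate both the
# best and worst ends of a ranking even when every student's score comes
# from the same distribution, i.e., when all differences are noise.
import numpy as np

rng = np.random.default_rng(0)
n_schools = 2000
sizes = rng.integers(20, 2000, size=n_schools)            # invented enrollments
means = np.array([rng.normal(100, 15, size=s).mean()      # average score per school
                  for s in sizes])

order = np.argsort(means)
top50, bottom50 = order[-50:], order[:50]
print("median enrollment, all schools:   ", int(np.median(sizes)))
print("median enrollment, top 50 schools:", int(np.median(sizes[top50])))
print("median enrollment, bottom 50:     ", int(np.median(sizes[bottom50])))
# Both extremes show a markedly smaller median enrollment than the overall
# median -- Kahneman's small-sample variability at work.
```

Rerunning with a different seed reshuffles which schools land at the extremes, but they are almost always the small ones; the same caution applies when a pattern emerges from only a handful of customer interviews.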

Suggestion

In the book Make It Stick, authors Peter Brown, Henry Roediger III, and Mark McDaniel share a powerful example of how our thoughts, feelings, and/or behaviors can be influenced by others (this is known as suggestion): “In one example, people watched a video of a car running a stop sign and colliding with another car passing through.” Participants were asked to estimate the speed at which the cars collided, and one group’s estimates were higher than the other’s. What was the difference between the two groups? It will not surprise you that the group giving the higher average estimate was asked the speed at which the cars “smashed,” whereas the other group was asked the speed at which the cars “contacted.” (Although perhaps this result would have been more surprising had I not just suggested that it was unsurprising.) You have likely been given the advice at some point to avoid leading questions, and this is one of the reasons why. Review the questions you plan to ask your interviewees and check for ways in which their wording could bias the answers.

Related to the effect of suggestion is anchoring. As Kahneman explains,

“The phenomenon we were studying is so important in the everyday world that you should know its name: it is an anchoring effect. It occurs when people consider a particular value for an unknown quantity before estimating that quantity… the estimates stay close to the number that people considered.”

It is important to take anchoring into account if you are asking your interviewee to estimate a quantitative value, such as when you are trying to determine how much people are willing to pay for your product.

What next?

This article is not an end-all-be-all resource on the topic of primary research; there are many other resources you can use to bolster your primary research skills. In addition to the works I have cited above, the Disciplined Entrepreneurship Workbook lists further recommendations at the end of its primary market research section.

The most important thing you can do, as with developing any skill, is to practice. I still have a lot to learn on the topic, and I recognize that the best way for me to grow my skills in primary research is to continue to do more of it. I recommend you do the same.

Have thoughts on the approaches I outlined in this article? Leave a comment and I look forward to engaging with you there.

-Duncan Mazza, founding partner of Green Line Ventures

Pre-interview preparation example

To provide a concrete example of how you can approach the definition of learning objectives that precedes your primary research, below is a snippet of the material I generated for this process during a class project in which I sought to determine the viability of a product idea I had:

  • Summary of idea: A web-based image processing and analysis tool for scientific and R&D applications.
    - Potential users include scientists who need to do image processing (e.g., counting cells from microscope images) and engineers in R&D that have image analysis needs.
    - Value proposition: a modern browser-based user interface, comprehensive visual scripting capabilities (think drag-and-drop vs. writing code), cloud-based data storage and computation, real-time collaboration, and exporting of a processing stack to a variety of other languages or scripting tools.
  • Background research findings: Existing solutions that my target audience already uses to solve their image analysis problems include ImageJ/Fiji, MATLAB, OpenCV, Scilab, Icy, and Wimasis.
    - These solutions span the spectra from open-source to closed-source and from software development kit to fully UI-based visual scripting. None of the existing solutions offers the full set of features I outlined in my value proposition.
    - Forums for question-asking are bountiful for these platforms and give insight into the difficulties that users have with their respective solution.
  • Assumptions:
    - When people use open-source image processing software, it’s because it’s free, it’s what other people around them use, and it’s functionally adequate (support for plug-ins, doesn’t crash, fast, lots of features, etc.). However, they often find the user interface to be unintuitive and miss out on the frequent updates, superior documentation, and customer support often found with proprietary software.
    - People prefer to use a visual scripting workflow in most image analysis scenarios.
    - When using a solution without live collaboration features, users want to collaborate with their peers, but find themselves unable to collaborate effectively and/or efficiently.
  • Hypotheses:
    - Users in academic and scientific institutions will see value in having a tool at the center of their image analysis workflow that is web-based and collaborative, able to integrate with their prior solutions, and designed with a superior user interface.
    - Purchasers will be willing to pay for my solution even if their prior solution was free or if it comes as an addition to an existing expensive software license.
  • Measuring: For an early indication of product-market fit, I will look for an affirmative response to at least two of the following three measures from >50% of the people I interview (a small tallying sketch follows this list). The measures are: did the interviewee…
    1. Express dissatisfaction with their current workflow/solution? (This would be mapped to specific information about the solutions that they use, like whether they are open-source, have visual scripting capabilities or are code-based, etc.)
    2. Express an interest in trying my solution (for free)?
    3. State that they would want/use collaborative features for their image analysis workflow?
  • Static prototype: an image of the static prototype of the application I envisioned.
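As a small companion to the measuring criteria above, here is a hypothetical sketch (in Python; the field names and responses are invented for illustration) of how the 2-of-3 tally could be applied consistently across interview notes rather than by gut feel:

```python
# Hypothetical tally of the 2-of-3 / >50% criterion described above.
# The keys and answers below are invented for illustration only.
interviews = [
    {"dissatisfied": True,  "would_try": True,  "wants_collab": False},
    {"dissatisfied": False, "would_try": True,  "wants_collab": False},
    {"dissatisfied": True,  "would_try": True,  "wants_collab": True},
    {"dissatisfied": False, "would_try": False, "wants_collab": True},
]

def meets_bar(answers: dict) -> bool:
    """True if the interviewee answered 'yes' to at least 2 of the 3 measures."""
    return sum(answers.values()) >= 2

share = sum(meets_bar(a) for a in interviews) / len(interviews)
print(f"{share:.0%} of interviewees met the 2-of-3 bar (target: >50%)")
```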

I found this preparation integral to directing the interview conversations to yield useful insights. However, reflecting on this work, I recognize that I could have improved upon it by not skipping the step of converting my hypotheses into a specific list of questions to reference during the interviews. Instead, I went into the interviews with the learning objectives as a mental note and improvised their conversion into questions. The process still yielded insightful results, but I see skipping that step as a missed opportunity.

During an internship in which I did primary research, I followed the full process of objective and hypothesis definition → specific list of questions → interviews centered around those objectives and questions → synthesis of interviews, and found that defining a specific list of questions beforehand was helpful not only during the interviews but also in comparing the findings between different interviews.

Acknowledgements

My skills and experience in primary market research have been shaped by my professors and other mentors, to whom I am very grateful for their guidance.

All cartoons in this article were made by Tom Fishburne for the Talking to Humans book and can be found here: https://www.talkingtohumans.com/cartoons.html.

