Data is a key part of product management. We gain our intuition by looking at data. We come up with hypotheses based on our observations of data. We test and validate these hypotheses using data. And we make key product decisions, then monitor and track the resulting changes in the data. In a nutshell, data helps us go from ideas to facts to decisions.

However, working with data is also fraught with pitfalls that every product manager should avoid. Data can sometimes be misleading or incomplete. It might not tell the whole truth and lead you in the wrong direction. And it might amplify your own erroneous assumptions.

We interviewed several seasoned product managers about the data traps that PMs can fall into and how they can avoid them. We asked them the following questions:

  • What are common data mistakes that you usually spot?
  • What are the most effective methods to mitigate the data mistakes PMs make?

We would like to thank all the product managers who shared their experience with us and helped us answer these important questions.


Q: What are common data mistakes that product managers make?

Depending on how you look at and work with your data, you can extract different insights. You must know what your data covers and doesn’t cover, what the blind spots are, and what you might have been overlooking. Here are some of the data mistakes that can steer your product work in the wrong direction.

Mistake 1: Not thinking about the data and KPIs that verify your hypothesis before implementing a feature

In many cases, PMs get so caught up in building and releasing the product (or a feature/sub-product) on time that they don’t think about what comes after. They don’t stop to reflect on the data they need to collect and the metrics they must measure to evaluate the product’s added value and its effectiveness in solving user problems. This often leads to product teams making decisions based on misguided intuitions or making hasty and late-stage efforts to implement data-collection features after releasing the product.

Bart Jaworski (PM Teacher and Senior PM at Microsoft)

The worst mistakes are when the PM/team is so overworked that the tracking dashboard for a feature is not put together before the feature is released. As a result, the team has to later interrupt its work to put together the tools required to collect data, run tests, and validate hypotheses. 

Michal Bloch Ron (Senior Product Manager at Microsoft Teams)

The first product I ever worked on was a total failure. In retrospect, we didn’t define the right KPIs in the beginning, and that shifted our focus from successful usage to successful technology. We built a meeting optimization and recommendation experience for users’ work calendars. Our KPIs were mainly focused on the AI model’s accuracy and success, rather than the actual experience of the user. That made us focus too much on validating the model instead of finding its product/market fit.

We had to cut the project after spending months only building technology because we didn’t use data early enough to validate the user value and user experience.

What I learned was that I should focus on getting something out fast to start measuring. This helps you reach product/market fit quickly, or fail fast and iterate. We now prioritize the product experience in our KPIs, and each KPI is mapped to and contributes to a bigger story.

For example, in one of my recent projects, we decided that the key metrics in the MVP would be the percentage of users who finished the flow and the percentage who returned to the experience, rather than awareness of the new experience. So we focused on making the experience reliable and completable end-to-end instead of investing in discovery.

After we validated that the experience was successful, we invested in developing discovery features to expand our user base.

Mistake 2: Mistaking correlation for causation

Products can be tracked across many different metrics and data points. And while the changes in several data points might be correlated, that does not necessarily mean that one causes the other. It is crucial to differentiate between correlation and causation when trying to identify the true relationship between different variables. Not doing so can result in misleading experiments, misguided decisions, and wasted time.
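A toy simulation can make this concrete. In this hypothetical sketch (all names and numbers are invented), a confounding variable such as marketing spend drives two product metrics at once, so the metrics correlate strongly with each other even though neither causes the other:

```python
import random

random.seed(42)

# Hypothetical scenario: daily marketing spend (a confounder) drives both
# app installs and support tickets. Installs and tickets will correlate
# strongly with each other, yet neither causes the other.
n = 200
spend = [random.uniform(1_000, 10_000) for _ in range(n)]
installs = [0.05 * s + random.gauss(0, 30) for s in spend]
tickets = [0.01 * s + random.gauss(0, 10) for s in spend]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, stdlib only."""
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(installs, tickets)
print(f"correlation(installs, tickets) = {r:.2f}")  # strong, yet not causal
```

A PM who saw only the installs and tickets series might conclude that one drives the other; only looking for the confounder reveals the real relationship.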

Karen Geva (Senior PM at Gigya-SAP)

As a PM, I work closely with data and product analysts who are in charge of analyzing the usage stats of a product. I’ve noticed that because the analyst is not always aware of the product strategy or familiar with the personas that engage with it, the cause of the trends or correlations identified is sometimes missed. Therefore, as a PM, I usually try to drill down into the numbers with the analyst, ask questions, and request additional reporting. In return, I try to share as much knowledge as I can with the analyst so they can effectively identify interesting insights.

Mistake 3: Falling into the confirmation bias trap

Product managers have hypotheses and assumptions about their product and users. If they’re not careful, they can fall into the confirmation bias trap, where they start cherry-picking data that confirms their hypotheses and ignore other signals that draw the bigger picture and might point to the contrary.

Joel Polanco (Senior PM at Intel Corporation)

One big mistake I’ve seen teams make, particularly in the discovery phase, is confirmation bias. New product managers or product development teams have a tendency to believe that their solution hypotheses are going to be the actual solutions that end up scaling out. They’ll go out and start interviewing potential customers and cling to responses that support their solutions. As a result, they fail to empathize with their customers’ pain points and fail to address their customers’ true underlying needs.

Mistake 4: Working with inconsistent and/or corrupt data

Make sure your decisions are based on high-quality data. PMs should be mindful of corruption, inconsistencies, missing information, and other problems in the data.

Bill Leece (Senior PM at Google)

There are two types of mistakes that stem from the data itself:

  1. Differences in the same metric across systems:

Something that’s quite common is finding discrepancies in the same metric generated from two separate data sources. Ideally, you have a single “source of data truth” so that this doesn’t happen. When you find such discrepancies, it is important to understand what’s driving the differences. Often the discrepancy is due to the different latencies (“data freshness”) of different systems. Sometimes it is due to slightly different definitions of a metric across systems, which is why it is important to ensure that metric definitions are clear, consistent, published, easy to find, updated with a clear change history, and reviewed periodically. It sounds easy, but in reality this is an operational challenge that requires organizational discipline.

  2. Lost or corrupted data:

Having copies of key data so that “lost” or corrupted data can be reconstructed is a must. Where PM is more involved is in helping engineering and operations teams identify scenarios where data can be lost or corrupted and helping to ensure that there is a robust detection and monitoring system in place.

Probably the worst data-related mistake I’ve personally made was not having sufficient measurement and alerting systems in place to detect problems in a complex software system. Without robust alerting systems that are constantly checking the existence and quality of your underlying data, you run a real risk that issues will be detected by… the customer. You don’t want that. The immediate cost can be easily measured in terms of lost orders/refunds, but the future cost is more likely the loss of a customer or customers, and thus the loss of the lifetime value of one or more customers.

Building robust alerting and monitoring systems and clearly defining operational processes that need to be enacted in various scenarios is key. In a larger organization, this is likely not part of a PM’s job, but if you’re in a startup, you should assume it is your job unless there is a clear and explicit understanding that someone else or some other group (engineering and/or operations) is accountable. And even if others are accountable, to be successful, PMs need to be stakeholders—defining and reviewing the error detection scenarios and operational processes and resolutions.

Mistake 5: Relying only on data in the decision-making process and presenting a biased story

Another common mistake is relying solely on data. There are many situations where data alone won’t help make the best decision and will distract from other important factors. In some cases, PMs should use qualitative research or search for and rely on deep expertise.

Michal Bloch Ron (Senior Product Manager at Microsoft Teams)

Data is one of several signals for us to spot problems or validate hypotheses. It’s not likely to explain the root cause or bring us revolutionary ideas. In B2B products, in which there’s usually direct access to customers, relying solely on data could mislead us or make us think narrowly.

Data is not a replacement for qualitative feedback, and vice versa. Our customers are the best place to understand the trend we see in the data, design an idea, test it, and return to the data to see if that idea worked or not.

Data helps to tell a story—the story we want our audience to perceive. 

Numbers are effective elements in illustrating key messages, and context plays a big role. Presenting usage or growth in absolute numbers can look very impressive, but once we turn it into a percentage and look at the proportion relative to the total market or user base, we get a different story.
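A quick arithmetic sketch of that framing effect (the figures here are invented for illustration):

```python
# Hypothetical numbers: 50,000 monthly active users sounds impressive
# as a headline, but against a 10M-user addressable market the same
# fact reads very differently.
mau = 50_000
market = 10_000_000

share = mau / market
print(f"{mau:,} MAU")                 # the impressive absolute number
print(f"{share:.1%} of the market")   # the sober relative context
```

Same data, two stories; which one you lead with shapes what your audience concludes.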

Data helps us to prove the why, but great ideas won’t come from just analyzing data. Some data trends might be anomalies, and some could point to something important we should improve.

Mistake 6: Using the wrong processes and tools to make inferences from data

There are many potential mistakes that PMs and product teams make when analyzing experiments, product changes, and metrics fluctuations. It’s important to know how to get from data to decisions.

Oleg Ya (Founder at GoPractice, ex-Data Scientist at Facebook/Meta)

The area where many PMs and product teams make mistakes is drawing conclusions from data. There are many mistakes you can make when deciding based on data, even if the data itself is high quality. For example, product teams often use the wrong statistical tests to determine whether a difference in a metric is statistically significant. Some teams don’t use statistical tests at all. Other common mistakes when analyzing A/B tests and the impact of product changes are the peeking problem and exposing users who don’t experience any changes in the product to the experiment. There are many more: using the wrong metrics to make decisions, calculating metrics in the wrong way, and relying on third-party data without knowing its source and collection methodology, to name a few. The key point is that it is important to invest in the skills required to transform data into high-quality decisions and insights.
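As one concrete sketch of what “using a statistical test at all” can look like, here is a standard two-proportion z-test for an A/B experiment's conversion rates, written with the Python standard library alone. The conversion counts are invented for illustration:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    Returns (z statistic, p-value). Assumes large enough samples for
    the normal approximation to hold.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 480/10,000 conversions in control,
# 540/10,000 in treatment.
z, p = two_proportion_z_test(480, 10_000, 540, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the p-value lands just above 0.05, which is exactly the situation where a team tempted to peek early or eyeball the raw rates would declare a winner that a proper test does not support.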

Mistake 7: Overestimating the significance of your data

One of the mistakes product managers might make is generalizing the insights they gain from their data beyond its true boundary. In reality, every market segment, application setting, and environment has its own sensitivities and dynamics.

Joel Polanco (Senior PM at Intel Corporation)

I’ve been exploring computer vision and artificial intelligence technologies. One of the worst mistakes I made was assuming that because we had a customer who was willing to share video data with us, we would have the necessary video data to build models that would generalize and scale to other customers. We spent a lot of time acquiring and annotating data only to find out that the models that were built would not generalize well; they would only solve very specific, niche use cases.

Knowing what I know now, I always go back to thinking more broadly about artificial intelligence problems and the importance of defining your data acquisition strategy from the get-go. Many companies will tell you they have an artificial intelligence strategy. But do they have a data strategy? If the answer is “no” or “we are still working through that” then the fact of the matter is, they don’t have a strategy; they have aspirations.

Q: What are the most effective methods to mitigate the data mistakes PMs make?

While every new product comes with its own nuances and potential pitfalls, there are some practices that can help you minimize mistakes and be better prepared to spot and handle failures before they cause irreversible damage to your efforts.

Experienced product managers had the following key recommendations to share:

  • Carefully identify and choose your KPIs to make sure you’re tracking and monitoring the right data
  • Build robust alerting and monitoring systems that track key data points and metrics
  • Always keep an eye on your most important metrics, both quantitative and qualitative
  • Improve your data querying and modeling skills
  • Be aware of and careful about your biases 

Karen Geva (Senior PM at Gigya-SAP)

Think about the main goal of the feature or product, identify the KPIs that will tell you whether you have achieved that goal, and then make sure the data required to calculate those KPIs is collected.

Joel Polanco (Senior PM at Intel Corporation)

One of the most effective methods I’ve seen for all product managers is to assemble and lead periodic metrics reviews. Identify important metrics, assemble them systematically, and report on them. These metrics should include both qualitative and quantitative metrics that either prove or disprove your hypothesis, show that your team is making progress, and most importantly, that your team is driving towards an outcome. It is important to develop data feeds from a variety of sources (sales, finance, customer interviews, etc.) whether you are in the discovery, delivery, or scaling phase of your initiative.

Bill Leece (Senior PM at Google)

Be empowered: As a PM, you need to be able to pull data from various sources yourself and analyze that data via SQL. You should not have to rely on engineering to answer key “SQL-answerable” questions. 
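As a minimal sketch of what a “SQL-answerable” question looks like (the event table, event names, and rows here are invented), a PM can answer a conversion question directly against an event log without filing an engineering ticket:

```python
import sqlite3

# Hypothetical event log: (user_id, event) rows a product might emit.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "signup"), (1, "purchase"),
     (2, "signup"),
     (3, "signup"), (3, "purchase")],
)

# "What share of signed-up users went on to purchase?" is a
# SQL-answerable question.
row = conn.execute("""
    SELECT
      COUNT(DISTINCT CASE WHEN event = 'purchase' THEN user_id END) * 1.0
        / COUNT(DISTINCT CASE WHEN event = 'signup' THEN user_id END)
    FROM events
""").fetchone()
print(f"signup -> purchase conversion: {row[0]:.0%}")
```

The same query shape works against a real warehouse; the point is that a PM comfortable with SQL can pull the answer in minutes rather than waiting on engineering.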

Be proactive: Make sure you have robust alerting and monitoring systems in place to detect data quality issues early—if a customer detects a data quality issue before your organization has detected it, that should be a huge red flag—your company has an urgent problem that needs to be solved.

Be self-aware: You have biases, and you should strive to step outside yourself and your ego to understand them. We fear being wrong and we fear being ridiculed; part of being mature is getting over these fears to focus on uncovering the truth, even if it’s not what you thought it would be or indicates that you have some blind spots. A significant number of “data mistakes” can be mitigated simply by not being arrogant, by being open, humble, and willing to listen, and by not feeling personally attacked if someone finds something wrong with what you’re doing (as long as they’re not arrogant in the process either).

Michal Bloch Ron (Senior Product Manager at Microsoft Teams)

  • Define success from the user perspective and not from the technology perspective. For example, the percentage of users who clicked through, rather than the model’s accuracy score. It might be that the technology is not perfect, but users find value in it
  • Define measurable and actionable success 
  • Strategize—prioritize KPIs to prioritize feature work
  • Talk to users and get feedback all the time—use data along with qualitative feedback
  • Measure a single change each time to easily analyze, diagnose and iterate
  • Use data and KPIs to tell a story when presenting to customers, colleagues, and management; know your audience, their questions, and their goals