Data products create value by facilitating business goals. They do this by enhancing access to data and improving its utility. Creating successful data products can play a crucial role in solving today’s business problems. Let’s journey through key aspects that improve the probability of success of the data products we create.
Today, the value of facts, analytics and insights cannot be overstated. Equally important are fundamentals such as keeping a customer focus, measuring the impact of our data products and enabling customers to acknowledge the value those products add to their core business.
Foster Innovation
Most often, data products are created in response to a business opportunity or problem. When a company encounters one, it is vital to begin by defining it unambiguously and communicating the business consequences clearly. It is then in the hands of the expert teams, external or internal, to understand the opportunity or problem and come up with optimal solutions. If the problem definition already specifies parts of the solution, there is no room left for holistic solutions, or for combinations of solution components that have not been tried before.
Let’s be mindful of leaving room for innovation.
Build to Address a Need
Most failed initiatives cite the lack of a clearly defined need as the primary reason for failure. Consider Google Glass, an innovation that failed predominantly because it was not aligned with user needs, which made it hard for consumers to see its practical utility beyond its novelty.
To build to address a need, it is imperative to uncover the true value of data. We begin by formulating a hypothesis based on observations, then seek the quickest way to collect evidence that either supports or refutes our assumptions. Next, we explore how both current and potential customers could use the data. Assess its commercial potential by how it can enhance the customer’s core business holistically, not just meet an isolated need or produce an isolated benefit. A complete overhaul is more beneficial in the long run than applying smaller, fragmented solutions to the business piecemeal.
Understanding the value our data can offer lays a strong foundation for a data product and increases the probability of its success.
Understand Value Added
Every data product is set to evolve over time to support multiple outcomes and address emerging use cases, eventually becoming a future-proof, reusable asset. But beginning with an understanding of its unique value proposition anchors all efforts towards adding value. Market insights, an understanding of the value the product can add to the customer’s business and a keen end-user focus help us arrive at that value proposition.
This anchoring likely prevented Oral-B from adding a ‘play music’ feature to their electric toothbrush. Instead, it encouraged them to add a convenience feature: connecting toothbrushes to phones and sending reminders to order replacement heads.
Over time, data products deliver far greater returns despite their upfront costs. Airbnb realized this when it began providing high-resolution images of its accommodations at considerable added expense; revenues soon doubled, so the expenditure turned out to be an investment rather than a cost.
While it is essential to understand the customer, that alone is no longer sufficient. When we take the effort to understand our primary customer, it is relevant to understand not only the customer segment in general but also their core business, what enables it and what hinders it. Combining this with our knowledge of the available data, and our understanding of how it can add value to their business, paves the way for creating useful data products.
As we approach the launch of the data product, it is useful to apply the Pareto principle¹ and launch before we feel entirely ready to do so. In terms of adding value to the customer, done is far better than perfect.
Embedding feedback collection tools within our data product makes it convenient for users to share feedback directly and increases the likelihood of receiving actionable feedback. Measuring the impact of our data product by tracking adoption and usage lets us bring learnings back into the build phase, thereby closing the build-measure-learn loop. Around the time of launch, it is useful to measure user engagement, user acquisition, data product performance and so on.
By utilizing these metrics to learn, we can improve our data products iteratively and continuously.
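To make this concrete, here is a minimal sketch in Python of how adoption and engagement metrics might be computed from a hypothetical log of usage events. The event fields, metric definitions and numbers are illustrative assumptions, not a prescription from this article or any particular product.

```python
from collections import Counter
from datetime import date

# Hypothetical usage events emitted by a data product
# (user, day and feature names are invented for illustration).
events = [
    {"user": "acme-fleet", "day": date(2024, 5, 6), "feature": "export"},
    {"user": "acme-fleet", "day": date(2024, 5, 7), "feature": "dashboard"},
    {"user": "nordic-bus", "day": date(2024, 5, 7), "feature": "dashboard"},
    {"user": "city-lines", "day": date(2024, 5, 9), "feature": "api"},
]

def adoption_rate(events, licensed_users):
    """Share of licensed users who actually used the product."""
    return len({e["user"] for e in events}) / licensed_users

def engagement_by_feature(events):
    """How often each feature is used: a proxy for where value is found."""
    return Counter(e["feature"] for e in events)

def active_days(events):
    """Distinct days on which the product saw any use."""
    return len({e["day"] for e in events})

print(f"Adoption rate: {adoption_rate(events, licensed_users=10):.0%}")
print(f"Feature usage: {engagement_by_feature(events)}")
print(f"Active days:   {active_days(events)}")
```

Feeding numbers like these back into prioritization is what closes the build-measure-learn loop described above.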
Enable Value Acknowledgment
It is necessary to enable customers to acknowledge the value added by our data products, through training and education where needed. Though all data possesses intrinsic value, it takes focused effort to commercialize it. Packaging the data so that it addresses specific needs for specific customer segments paves the way for better monetization. Assessing the return on investment precedes identifying the most suitable pricing model.
Based on the target customer segment, the data product’s potential and other related aspects, the most appropriate pricing model is chosen from among the many available, such as cost-driven, competitor-driven, value-based, tiered, freemium and bundle pricing. Tesla adopts value-based pricing by emphasizing the unique benefits of its vehicles, and Starbucks implements it by focusing on the overall customer experience and the perception of quality.
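To illustrate the difference between two of these models, here is a minimal sketch; the tier names, prices and savings estimate are invented assumptions, not figures from any real product.

```python
# Illustrative pricing sketch: tier names, prices and savings figures are invented.

TIERS = {  # tiered pricing: monthly price by feature tier
    "basic": 500,
    "pro": 1_500,
    "enterprise": 4_000,
}

def tiered_price(tier: str) -> int:
    """Monthly price under a simple tiered model."""
    return TIERS[tier]

def value_based_price(estimated_annual_savings: float, capture_share: float = 0.2) -> float:
    """Monthly price that captures a fraction of the value the customer gains."""
    return estimated_annual_savings * capture_share / 12

print("Pro tier:", tiered_price("pro"), "per month")
print("Value-based:", round(value_based_price(120_000)), "per month")
```

In practice the choice between such models still depends on the target segment and the product’s potential, as noted above.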
Stay within Ethical and Governance Guardrails
While sharing data with external parties in the broader ecosystem, we need sharp expertise to approve it from the perspectives of intellectual property rights, data privacy, cybersecurity and legal compliance. We must scrutinize whether the underlying data contains trade secrets or personally identifiable information, exposes attack vectors, or violates regulatory requirements. This paves the way for the transparency that builds trust over time. We all know how Facebook was fined for its role in the Cambridge Analytica data harvesting, and we are equally aware of the data breaches at Tesla and many other companies that we hear about all too often.
At a time when we grapple with hallucinating AI tools, appropriate investments in governance and training are what can ensure a smoother application of this latest technology.
Organize in Data Product Teams
According to Conway’s law, “Any organization that designs a system will produce a design whose structure is a copy of the organization’s communication structure.” This is also reflected in the BAPO (Business, Architecture, Process, Organization) model, which states that business strategy should drive architecture and technology choices; these should drive processes, ways of working and tooling choices; and these, in turn, should define an organization best suited to realize the company’s business and technology strategy.
Organizing as data product teams is conducive to creating successful, value-adding data products. The best data teams stay value-driven and impact-focused all the way, prioritizing usability and design again and again. And if impact measurement shows a data product has no users, the best cultures celebrate the learning while phasing it out.
Change Direction, Where Needed
We cannot pick the winning ideas without first investing in the losing ones. It is important to accept that many of our ideas are likely to fail. This allows us to embrace failures, try to fail quickly, learn from those experiences and pivot where necessary. The trick is identifying the earliest point at which to pivot. Slack made a major pivot when it shifted focus from games to team communication, and Instagram pivoted from location-based check-ins to photo sharing. Also, lessons identified are not the same as lessons learned: teams need to make a conscious effort to reflect on and learn from their experiences!
In an era where industries are undergoing tectonic shifts and major transformations, the value one can offer via data products could make all the difference. This could be the key distinction between an industry leader and the rest of the pack!
¹ Investopedia: The Pareto Principle specifies that 80% of consequences come from 20% of causes, asserting an unequal relationship between inputs and outputs. It is also known as the Pareto Rule or the 80/20 Rule.
About the Author
Charanya Thangaraj is Service Area Manager of Data Access Services at Connected Services in Volvo Buses. She is responsible for commercializing products built from connected vehicle data.
Charanya is a seasoned data professional in fintech and automotive, dedicated to leveraging data to drive business value.
Her data leadership in the automotive industry includes leading enterprise-wide data governance efforts in the autonomous driving space, enhancing transparency of electromobility supply chains with blockchain-traced data and creating value-adding data and insights products.
As one of the speakers at the ninth annual Data Innovation Summit in Stockholm, Charanya presented on creating value-driven data products. Gain in-depth insights as she shares strategies for elevating your data game and driving business success, and tune into a recent on-site interview at the summit, where we had the chance to speak with Charanya!