Recently on this site, one of us wrote about the new product development analytics used by Netflix. In a nutshell, the company classified the key attributes of past and current products or services and then modeled the relationship between those attributes and the commercial success of the offerings. The result is a predictive model that tells the company how likely a new product or service is to succeed.
Netflix is not alone in this respect. Procter & Gamble also uses predictive analytics to introduce new products faster than the competition, and the company works diligently on predicting the likelihood of market success for these products. The company integrates virtual reality and 3D design tools to create realistic virtual prototypes. Moreover, P&G uses data and analytics from focus groups, social media, test markets, and early store rollouts to plan, produce, and launch new products.
As P&G and others have discovered, there is more to commercial success than new product selection. New product development is a broad process with many steps, and smarter decisions are needed across all of them; a Deloitte study, for example, found that 96 percent of product innovations fail to return the cost of capital, and two-thirds fail within two years.
The power of predictive analytics is multiplied when an organization takes an end-to-end process view of new product development (NPD). Idea generation and business case decision making are important. But an end-to-end view of performance in a business-process context provides additional opportunities to apply predictive analytics to improve performance in other areas, such as product development, testing, and launch.
Analytical methods apply to each of these steps. For example, in the area of product creation, it’s possible to improve performance by classifying key attributes of past success – such as early supplier involvement, broad cross-functional collaboration, and use of key metrics to move from one gate to the next – and then modeling the relationship between those attributes and the commercial success of the offerings.
Similarly, in the area of new product testing, it’s possible to improve performance by classifying key attributes of past success – such as customer involvement, sales force collaboration, and the use of key metrics – and then modeling the relationship between those attributes and the commercial success of the offerings. The same principles apply to planning and executing product launches, where a set of key attributes of past success can provide insight into what needs to be done to assure future success. The key in all of these areas is to collect data on attributes of the NPD process and then relate them to product success in the marketplace. This requires consistency, discipline, and a long-term perspective.
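To make the approach concrete, here is a minimal sketch of the modeling step the paragraphs above describe: relating binary NPD process attributes to past commercial success with a simple logistic regression fit by gradient descent. The attribute names and the data are invented for illustration; this is not Netflix’s or P&G’s actual model, and a real analysis would use far more data and richer features.

```python
import math

# Each row records three hypothetical process attributes for a past product:
# [early_supplier_involvement, cross_functional_collaboration, gate_metrics_used]
# Label 1 = the product was a commercial success. Data is synthetic.
X = [[1, 1, 1], [1, 0, 1], [0, 1, 0], [0, 0, 0],
     [1, 1, 0], [0, 0, 1], [1, 1, 1], [0, 1, 1]]
y = [1, 1, 0, 0, 1, 0, 1, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit(X, y, lr=0.5, epochs=2000):
    """Fit logistic-regression weights and a bias term by gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the linear score
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

w, b = fit(X, y)

# Score a hypothetical new product by its process attributes.
new_product = [1, 0, 1]
p_success = sigmoid(sum(wj * xj for wj, xj in zip(w, new_product)) + b)
print(f"Predicted probability of commercial success: {p_success:.2f}")
```

The fitted weights also offer a rough read on which process attributes have been most associated with success historically, which is often as valuable as the prediction itself.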
By taking a process-based view of the performance of NPD, organizations can improve their ability to view things from their customers’ perspective. For example, the typical company cares a lot about the cost of new product development and the ROI on its investment, while customers care mostly about other things, such as value for money, on-time product introduction (when promised), and quality (works right the first time).
By taking a customer-focused, business-process-based view, organizations can gain new insights into why their NPD performance is below expectations. Analytics can play a big role in overcoming the major obstacles to decision making in NPD, as identified in an Aberdeen Group survey.
As the Aberdeen survey suggests, part of the problem with NPD analytics is that the systems that supply data for new product development are fragmented and incomplete. Much of the data necessary for assessing customer demand and competitive responses is external to an organization. Some vendors, such as Signals Group (disclosure: Tom Davenport is an adviser to the company), are now offering systems and external information that support the entire NPD process. In addition, the rise of product lifecycle management (PLM) software from vendors like PTC, Autodesk, Dassault Systèmes, and Siemens has made it easier to access data from internal systems. PLM data is commonly used for reporting, but it is rarely employed for predictive analytics.
We are just at the beginning of the use of analytics for NPD. Examples such as the Netflix case should spur organizations to begin using these tools to develop more successful new products. From the beginning, however, organizations should adopt an end-to-end process perspective on NPD analytics so that they don’t optimize only a single aspect of the process.
Tom Davenport, the author of several best-selling management books on analytics and big data, is the President’s Distinguished Professor of Information Technology and Management at Babson College, a Fellow of the MIT Initiative on the Digital Economy, co-founder of the International Institute for Analytics, and an independent senior adviser to Deloitte Analytics. He also is a member of the Data Informed Board of Advisers.
Andrew Spanyi, the author of three books on process improvement and management, is the Founder of Spanyi International Inc. He has worked in the area of process management for over two decades and is on the Advisory Board of the Association of Business Process Management Professionals (ABPMP). He is also the Editorial Director for BPMInstitute.org.