How Product Stewards at HP and Shell Report Key Metrics
Metrics are key to communicating the business value of product stewardship, but knowing what to measure has been an ongoing challenge. Much of the work that product stewards do is intangible or difficult to accurately quantify. For example, how do you quantify the economic impact of completing a regulatory submission faster than expected? Likewise, the diversity of businesses and geographic locations makes it difficult to establish cross-cutting metrics.
So how should product stewardship programs move forward? Members of the Product Stewardship and Regulatory Affairs Council (PRSA) proposed some answers to that question in a recent report. “Measuring the Performance and Business Value of Product Stewardship” outlines an overall approach to designing a metrics program and describes a broad menu of potential measures in two areas: core stewardship work and “business value” work.
PRSA Program Director Rob Shimp gave an overview of the report and some suggestions for implementation. The first step is to be clear about your purpose in capturing data. Is it to improve the performance of the organization? To ensure you have the right people in the right jobs? Is it to quantify how product stewardship contributes to the growth of the business? Or to gauge the overall health of the organization? “A good metrics program can be any one of these or all of these,” said Shimp. The key is to specify what you want to accomplish. “Otherwise you can choose any one of a thousand things to measure, but they may not be fit for the purpose,” he said.
How HP Does It
Kathy Brewer, senior program manager at HP, explained that gathering data wasn’t the problem. “We don’t lack for metrics,” she said. But HP’s chief supply chain officer wanted those metrics rolled up into a single metric that would indicate how the function was performing.
Initially, Brewer’s team considered creating a weighted index, but she wasn’t sure that the final number would mean anything. A peer reminded her that metrics are meant to trigger conversations and actions. If one performance area is doing well but another is a red flag, a weighted average might hide that fact. Instead, he recommended using dashboards and reporting only the metrics that needed attention. “If everything’s good, you don’t need the conversation. If something is not going the way you want, trigger the conversation. That’s what we started looking at,” said Brewer.
For their top-level dashboard work, Brewer and her team focused on lost and delayed revenue. They set dollar thresholds corresponding to green, yellow, or red to reflect whether products were delayed getting into a country, for example.
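The threshold-and-exception approach Brewer describes can be sketched in a few lines of code. This is a hypothetical illustration only: HP's actual dollar cutoffs and metric names are not public, so the values below are invented for the example.

```python
# Illustrative sketch of a traffic-light dashboard status based on dollar
# thresholds, in the spirit of the approach described above.
# The cutoff values are hypothetical, not HP's actual figures.

def revenue_status(delayed_revenue_usd: float,
                   yellow_threshold: float = 1_000_000,
                   red_threshold: float = 5_000_000) -> str:
    """Map lost/delayed revenue to a green/yellow/red status."""
    if delayed_revenue_usd >= red_threshold:
        return "red"
    if delayed_revenue_usd >= yellow_threshold:
        return "yellow"
    return "green"

# Per the "trigger the conversation" principle, only non-green items
# surface for discussion; green items stay off the agenda.
metrics = {"Country A launch delay": 250_000, "Country B hold": 6_200_000}
needs_attention = {name: revenue_status(value)
                   for name, value in metrics.items()
                   if revenue_status(value) != "green"}
print(needs_attention)  # {'Country B hold': 'red'}
```

The design point is that the dashboard filters rather than averages: a single red item stays visible instead of being washed out by a weighted index.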
Another metric relates to the company’s takeback operations: Brewer reports on the program’s financial standing. The team also reports on response rates to customer requests and inquiries that come in through sales, with the goal of responding to those inquiries on time.
How Shell Does It
Jason Andrews, product steward at Shell LP, said metrics give the company’s business leaders a sense of the group’s overall health. The data is designed to help internal stakeholders make decisions, rather than as a benchmarking exercise.
The metrics are tied to specific company goals, including financial performance and operational excellence. The group looks at key performance indicators aligned with those objectives. For instance, the length of time it takes to issue an SDS is an indicator tied to operational excellence. The company uses a dashboard to visualize its performance.
Another objective is to ‘master regulatory complexity.’ Visually, this metric is conceived as a tank that fills up. You set a target for the number of ‘regulatory wins’ in a year, and employees report on them. A ‘win’ isn’t too strictly defined, said Andrews. It could be for completing a regulatory submission sooner than expected, for example. The point is that if you set a target – say 20 per year – and you have only three by the third quarter, then you might not be focused enough on the goal, and that triggers a conversation.
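The pacing check Andrews describes amounts to comparing reported wins against a pro-rated share of the annual target. A minimal sketch, using the hypothetical numbers from his example (Shell's actual tracking mechanism is not described in code anywhere):

```python
# Illustrative pacing check for the 'regulatory wins' tank described above.
# A linear pro-rating by quarter is assumed here for simplicity.

def wins_on_track(wins_so_far: int, annual_target: int, quarter: int) -> bool:
    """Return True if the win count is keeping pace with the annual target."""
    expected_by_now = annual_target * quarter / 4  # pro-rated target
    return wins_so_far >= expected_by_now

# A target of 20 wins per year but only 3 reported by Q3 falls well short
# of the pro-rated 15, so the metric flags a conversation.
print(wins_on_track(3, 20, 3))  # False
```

As with HP's dashboard, the number itself matters less than the conversation it triggers when the tank is filling too slowly.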
Andrews also noted that the group’s dashboard includes comment fields. When the team reports the data, those comments help start conversations when the numbers alone need context.