I can’t say enough good things about Candela’s service team. To a person they are skilled, dedicated, hard-working, customer-focused, and great to work with. The customers love them, so the team plays a key role in Candela’s continued market leadership.
When I rejoined Candela in 2007, Candela’s service team was understaffed and overworked, the result of a corporate hiring freeze, higher-than-expected service demand, and some turnover. While we were working extremely hard to support Candela’s customers, we were behind in preventative maintenance visits and sometimes late finishing product repairs. To me, it was obvious we needed two changes: more service staff and product reliability improvements. How could the rest of the company have missed this? How could we set up a system to make sure we meet our customer commitments in the future?
When I talked to other managers in the company, they knew that the service team was burning out, and that there were real product issues. But I found little sense of urgency. Each manager was focused on his or her own issues. I realized that our service team needed to communicate their issues better. I also realized that our service team’s issues were our customers’ issues.
So we turned the service department’s internal planning metrics into customer-facing metrics. These revised metrics show the company’s service performance from the customer’s point of view. In reality, it’s the same data displayed a different way. But sometimes, the presentation makes all the difference.
Let me give you some examples of our customer-facing metrics.
Our service department had been tracking mean-time-between-failure (MTBF), a common service metric. Customers, though, are not concerned with MTBF. They are concerned with unscheduled down days, when the product cannot be used and patients must be rescheduled. So we used our MTBF data to measure the average number of unscheduled-down-days per installed-base-year. Needless to say, this number got the attention of the sales team.
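The reframing above can be sketched in a few lines. This is a minimal illustration, not Candela’s actual calculation; the incident records, field names, and fleet size are all hypothetical.

```python
# Hypothetical incident log: the same failure data that feeds MTBF,
# re-aggregated as unscheduled down days per installed-base-year.
incidents = [
    {"unit_id": "A1", "down_days": 2},
    {"unit_id": "A2", "down_days": 1},
    {"unit_id": "A1", "down_days": 3},
]

installed_units = 40    # units in the field (assumed)
period_years = 1.0      # length of the reporting window

# One installed-base-year = one unit in the field for one year.
installed_base_years = installed_units * period_years
total_down_days = sum(i["down_days"] for i in incidents)

down_days_per_ib_year = total_down_days / installed_base_years
print(f"{down_days_per_ib_year:.2f} unscheduled down days "
      f"per installed-base-year")
```

A practice with 40 units would read this as “on average, each machine loses about this many clinic days a year to unscheduled failures,” which is far more vivid to a sales rep than an MTBF figure.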
As a capacity planning metric, we had been tracking the average number of days we took to respond on-site to a customer problem. Since our warranties and service contracts committed us to respond within one day, customers expected us there in one day every time. So we took our days-to-respond data and created a metric showing the percentage of problems to which we responded within one day. The initial score was not too pretty.
We tracked our preventative maintenance visits (PMs) per month, and developed a metric for the average-days-PM-outstanding (similar to days-sales-outstanding in accounts receivable, a key sales team metric). Again, this score was less than perfect at the start.
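One plausible way to compute such a metric (the source doesn’t spell out the formula, so this is an assumption): for each PM visit that is past due, count the days since its due date, then average. Dates below are invented for illustration.

```python
from datetime import date

# Hypothetical reporting date and due dates of PMs not yet completed.
today = date(2008, 3, 1)
overdue_pm_due_dates = [
    date(2008, 1, 15),
    date(2008, 2, 20),
    date(2008, 2, 28),
]

# Days each overdue PM has been outstanding, by analogy with
# days-sales-outstanding on unpaid invoices.
days_outstanding = [(today - due).days for due in overdue_pm_due_dates]
avg_days_pm_outstanding = sum(days_outstanding) / len(days_outstanding)

print(f"average days PM outstanding: {avg_days_pm_outstanding:.1f}")
```

As with DSO, a rising trend line here flags a backlog building up long before customers start calling to complain.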
I could go on. We presented our customer-facing metrics together with our internal productivity metrics (e.g. average service calls per rep per day), showing how hard our service team was working. No member of the senior management team could ignore this data. We all knew what it meant. Mike, our service VP, proposed creative ways to quickly expand capacity at minimal cost, and senior management gave their full support. As a company we also expanded our focus on product reliability. We were happy to see our customer-facing metrics improve.
Lessons learned:
Develop your metrics from the customer’s viewpoint. The format of metrics matters. Customer-facing customer-service metrics effectively convey the customer’s view of service effectiveness throughout the organization. The same data, presented in a non-customer-facing way, simply does not carry the same weight.
Define goals for each metric based on customer requirements. Get senior management buy-in on each goal. It may not be realistic to hit the 1-day response commitment every single time, but why not 97% or 99% of the time? What do your customers expect? Find out, then make it happen.
Report with the right frequency. Candela’s service team was already reporting weekly, so we should have been in good shape. However, not every senior manager read the report each week. So we included the service effectiveness reporting in a quarterly analysis of operations, which went to senior management and the board.
Report graphically. Show the metrics changing over time. We showed the trailing 12 months. Customer-facing metrics should act as an early warning system, enabling the company to take action to maintain high customer satisfaction. Graphs make it easy to see trends.
Add text sparingly to clarify trends. Distinct changes in trendlines are often driven by new product introductions, staffing level changes, increasing sales and other events. Point them out so the reader doesn’t need to guess.
How do you measure your customer service effectiveness?