Power Lines Blog

Time to reform safety benchmarks in public power

Safety Benchmarking

By Mike Hyland, Senior Vice President, Engineering & Operations and Alex Hofmann, Director, Energy & Environmental Services — American Public Power Association

We spend a lot of time here at the American Public Power Association considering how public power utilities can better measure and improve workforce safety. We are deeply involved with the Institute of Electrical and Electronics Engineers’ National Electrical Safety Code and want to ensure that the most up-to-date and data-driven safety practices are adopted by our industry. As staff stewards of APPA’s own Safety Manual and its associated Revision Task Force, we feel compelled to help public power continuously improve its safety culture.

Public power utilities take safety seriously, but the safety benchmarks we use, such as the Occupational Safety and Health Administration (OSHA) incidence rate, don’t always work as intended. The OSHA incidence rate can provide a measure of safety performance, but for true benchmarking purposes we need to measure safety over the long term to see if we really are getting better or worse. Unfortunately, the OSHA incidence rate is not effective for long-term measurement.

The OSHA incidence rate was developed to help industries measure their relative organizational safety by looking at incidents (injuries and illnesses) normalized for the size of the organization. This is effective from a bureaucratic standpoint because it expresses incidents per 100 full-time employees (200,000 worker hours), which makes organizations of different sizes comparable. However, problems arise when the metric is used as a benchmark.
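
For reference, the standard calculation multiplies recordable cases by 200,000 worker hours (100 full-time employees at 40 hours a week, 50 weeks a year) and divides by the hours actually worked. Here is a minimal sketch in Python; the single case and 200,000 hours in the example are purely illustrative:

```python
def osha_incidence_rate(recordable_cases: int, hours_worked: float) -> float:
    """OSHA incidence rate: recordable cases per 200,000 worker hours
    (100 full-time employees x 40 hours/week x 50 weeks/year)."""
    return recordable_cases * 200_000 / hours_worked

# One recordable case at a utility logging 200,000 worker hours in a year -> rate of 1.0
print(osha_incidence_rate(1, 200_000))
```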

Let’s walk through an example. If you’re a small utility and have a reportable illness or injury, your calculated OSHA incidence rate for the year might be huge compared to the rest of the industry. However, we’ve also found that in most years the OSHA incidence rate for that same small utility is zero. If you’re a large utility and you have the same number of incidents, your OSHA incidence rate is going to be a small fraction of the industry rate, and it won’t vary much no matter how good or bad your safety record is.

At a large utility, your OSHA incidence rate will remain small even if a lot more workers have injuries or illnesses than at a small utility. Using the OSHA incidence rate can mislead utilities of all sizes on their safety performance.
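
A quick illustration with hypothetical staffing levels (roughly 15 and 1,000 employees at 2,000 hours each, numbers chosen purely for scale) shows how a single case dominates the small utility’s rate while barely registering at the large one:

```python
# Hypothetical staffing levels, chosen only to illustrate the scaling effect.
small_hours = 15 * 2_000      # ~15 employees at roughly 2,000 hours each
large_hours = 1_000 * 2_000   # ~1,000 employees

for label, hours in [("small utility", small_hours), ("large utility", large_hours)]:
    rate = 1 * 200_000 / hours   # one recordable case at each
    print(f"{label}: 1 case over {hours:,} hours -> OSHA incidence rate {rate:.2f}")

# small utility: 1 case over 30,000 hours -> OSHA incidence rate 6.67
# large utility: 1 case over 2,000,000 hours -> OSHA incidence rate 0.10
```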

To compound the benchmarking problem, there are fundamental differences in reporting. For example, not all utilities classify the same events as injuries or accidents, and not all injuries are the same — a death is not the same as a paper cut (but that’s a topic for another blog). We need to get better at consistently reporting accident and injury types so that we can focus on adapting to the conditions under which accidents happen. Consistent reporting will also help the industry better identify safety trends.

What we’ve found by analyzing APPA’s RP3 (Reliable Public Power Provider) and Safety Awards of Excellence data is that public power utilities have “five nines” of safety, or a 99.999 percent likelihood of not having an accident (reportable injury or illness) in any given hour of exposure. We found that because of this safety record, the OSHA incidence rate doesn’t seem to make much sense for a utility with fewer than 20 employees. This means more than 50 percent of public power utilities are left with inadequate benchmarking numbers because they are too small. And large utilities are left with too few peer utilities to benchmark against and may have a false sense of safety.
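
A rough back-of-the-envelope sketch shows why (the 20-employee utility and 2,000 hours per worker are illustrative assumptions, not figures from the APPA data): at five nines, such a utility expects well under one case per year, so roughly two out of three years report a rate of exactly zero, while a single case in an unlucky year spikes the rate to 5.0.

```python
p_incident_per_hour = 1 - 0.99999        # "five nines" likelihood of a safe hour
hours_per_year = 20 * 2_000              # hypothetical 20-employee utility

expected_cases = p_incident_per_hour * hours_per_year          # ~0.4 cases per year
p_zero_case_year = (1 - p_incident_per_hour) ** hours_per_year # ~67% of years report zero

print(f"expected cases per year: {expected_cases:.2f}")
print(f"chance of a zero-case year: {p_zero_case_year:.0%}")
print(f"incidence rate if a single case does occur: {200_000 / hours_per_year:.1f}")
```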

This chart of 2014 utility safety data should help illustrate why the OSHA incidence rate doesn’t really work as a multi-utility safety benchmark. For the smallest utilities (less than 200,000 worker hours of exposure), the incidence rate curves are highly visible. Note the OSHA incidence rate variance among the utilities that have had a single incident.

[Figure: APPA 2014 utility safety data, OSHA incidence rate versus worker hours of exposure]

It gets worse if we try to take the average of the incidence rate across a utility group with widely varying worker hours of exposure. We almost always end up with too many zeros from the small utilities, while the large utilities sit close to the industry-standard cases-per-1,000-worker-hours-of-exposure rate no matter how they are performing (a small numerical sketch after the list below illustrates the problem). The graph illustrates how many people have to be injured at a utility to result in the same OSHA incidence rate, so consider the following:

  • A utility’s hazard-per-exposure-hour rate does not go away in a single zero-incidence-rate year.
  • A steady incidence rate that is lower than average does not indicate better-than-industry long-term safety performance.
  • Not all injuries and illnesses are equal, and we need better and more standardized information on the hazards that lead to incidents.
  • Larger utilities probably have better control over short-term safety-related factors, but they don’t necessarily have better long-term safety performance. They also have proportionately more employees in lower-hazard work (per worker hour), which reduces the effectiveness of the incidence rate metric as a benchmark. Public power utilities have similar numbers of lineworkers per customer, but they often have widely varying complements of additional staff; the position of “energy services director,” for example, would only be seen at a larger public power utility. This is also a topic for another blog.
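
Here is the averaging sketch mentioned above, with an invented mix of utilities (eight small with zero cases, two small with one case, one large with six cases) chosen only to show how a simple average of per-utility rates diverges from a pooled rate:

```python
# Hypothetical group: eight small utilities with zero cases, two small utilities
# with one case each, and one large utility. Numbers invented for illustration.
group = [(0, 30_000)] * 8 + [(1, 30_000)] * 2 + [(6, 2_000_000)]   # (cases, hours)

naive_average = sum(c * 200_000 / h for c, h in group) / len(group)

total_cases = sum(c for c, _ in group)
total_hours = sum(h for _, h in group)
pooled_rate = total_cases * 200_000 / total_hours

print(f"naive average of per-utility rates: {naive_average:.2f}")   # ~1.27
print(f"pooled rate (all cases / all hours): {pooled_rate:.2f}")    # ~0.70
```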

To us, the data says we need to start benchmarking over multiple years, just as we do with reliability data, to measure our safety performance over the long term and (hopefully) our improvement from year to year. Only after we’ve done this can we focus on what’s needed next: true standardization of safety-related reporting.

For public power, the average cases per 1,000 hours of worker exposure over the last three years (without zero-incidence-rate year utilities) is 0.044. How does your utility measure up?
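
As a rough sketch of how a utility might compute its own multi-year figure for comparison, pooling cases and hours across years (the case counts and worker hours below are invented, and keep in mind that the 0.044 average excludes zero-incidence-rate year utilities):

```python
# Hypothetical three years of one utility's data: (recordable cases, worker hours).
years = [(0, 250_000), (1, 255_000), (0, 248_000)]

total_cases = sum(c for c, _ in years)
total_hours = sum(h for _, h in years)

# Pooled cases per 1,000 hours of worker exposure over all three years.
three_year_rate = total_cases * 1_000 / total_hours
print(f"three-year cases per 1,000 worker hours: {three_year_rate:.3f}")
print("public power average (non-zero years only): 0.044")
```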

Michael Hyland

Senior Vice President, Engineering Services
