Evidence-Based Maintenance: How to Evaluate the Effectiveness of Your Maintenance Strategies

Binseng Wang, Clinical Technology Services, May 5, 2011


DESCRIPTION

Binseng Wang, ScD, CCE – Vice President, Performance Management & Regulatory Compliance, ARAMARK Healthcare's Clinical Technology Services

Clinical engineering (CE) professionals have realized for some time that the "preventive maintenance" (PM) they have been performing for many years is no longer able to prevent failures, although some safety and performance inspections (SPIs) can help detect hidden and potential failures that affect patient safety. To help CE professionals decide whether or not they should continue to perform scheduled maintenance (SM), a systematic method for determining maintenance effectiveness has been developed. This method uses a small set of codes to classify failures found during repairs and SM (PMs and SPIs). Analysis of the failure patterns and their effects on patients and users allows CE professionals to compare the effectiveness of different maintenance strategies and justify changes in strategy, such as decreasing SM, deploying statistical sampling, or even eliminating SM.

TRANSCRIPT

Page 1: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Evidence‐Based Maintenance: How to Evaluate the Effectiveness of Your Maintenance Strategies

Binseng Wang, Clinical Technology Services

May 5, 2011

Page 2: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

What is your definition of PM?

• Preventive Maintenance (or Preventative Maintenance)
• Predictive Maintenance
• Planned Maintenance or Proactive Maintenance
• Percussive Maintenance: the fine art of whacking the crap out of an electronic device (or anything else) to get it to work again. (Manny Roman, DITEC Ink)
• Percussive Management: the fine art of managing people with 2"x4" boards (or whatever else heavy is handy) but not killing them, aka waterboarding.

Censored by HS & HR…

Page 3: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

How do you currently decide on PM?

• OEM said to do it
• Joint Commission said to do it (100% for life support & less for non‐life support)
• Our state licensing code (or CMS rules) requires 100% PM on everything
• Even a single injury or death would be unacceptable ‐> total, absolute safety
• That is what and how we have always done it for the last 20‐30 years!

Remember the roast beef!

Page 4: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Good News and Bad News

• Good News
  – No significant changes to TJC Med Equip Mgmt standards from 2010
• Even Better News
  – CMS accepted TJC standards in lieu of "according to OEM recommendations"
• Bad News
  – Both CMS and TJC are going to scrutinize maintenance programs (strategies) more carefully
  – How do you prove your non‐OEM maintenance strategy is not shortchanging patient safety?!

Page 5: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Table of Contents

• Introduction
  – How do you convince surveyors that your maintenance program is effective?
• Evidence‐Based Maintenance
  – Maintenance planning (plan)
  – Maintenance implementation (do)
  – Maintenance monitoring (check)
  – Maintenance improvement (act)
• Discussion and Conclusions
  – Implementation lessons
  – Conclusions

[Diagram: Plan-Do-Check-Act cycle]

Page 6: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Acknowledgement

• The data presented here were collected by dozens of BMETs at hospitals managed by ARAMARK Healthcare under the leadership of the following Technology Managers:
  – Jim Fedele
  – Len Barnett
  – Tim Huffman, Steve Zellers
  – Bob Pridgen, Bob Wakefield, Allan Williams
  – Chad Granade
  – Bobby Stephenson
  – Dana Lesueur
  – Steve Cunningham
  – Bob Helfrich
  – Scott Newman
  – Jared Koslosky

Page 7: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

REFERENCE

• B. Wang, E. Furst, T. Cohen, O.R. Keil, M. Ridgway, R. Stiefel, Medical Equipment Management Strategies, Biomed Instrum & Techn, May/June 2006, 40:233‐237
• B. Wang, Evidence‐Based Maintenance, 24x7 magazine, April 2007
• B. Wang, Evidence‐Based Medical Equipment Maintenance Management, in L. Atles (ed.), A Practicum for Biomedical Technology & Management Issues, Kendall‐Hunt, 2008
• M. Ridgway, Optimizing Our PM Programs, Biomed Instrum & Techn, May/June 2009, 244‐254
• M. Ridgway, L.R. Atles & A. Subhan, Reducing Equipment Downtime: A New Line of Attack, J Clin Eng, 34:200‐204, 2009

Page 8: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Related Publications

• Wang B, Fedele J, Pridgen B, Rui T, Barnett L, Granade C, Helfrich R, Stephenson B, Lesueur D, Huffman T, Wakefield JR, Hertzler LW & Poplin B. Evidence‐Based Maintenance: I - Measuring maintenance effectiveness with failure codes, J Clin Eng, July‐Sept 2010, 35:132‐144
• Wang et al. Evidence‐Based Maintenance: II - Comparing maintenance strategies using failure codes, J Clin Eng, Oct‐Dec 2010, 35:223‐230
• Wang et al. Evidence‐Based Maintenance: III - Enhancing patient safety using failure code analysis, J Clin Eng, Apr‐June 2011, 36:72‐84

Page 9: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

How do you convince surveyors that your maintenance program is effective?

• Adopted "risk"‐based inclusion criteria
  – Good intentions (plans) do not guarantee good outcomes
• PM completion per TJC requirements
  – Most "PMs" do not prevent failures but only find failures that already occurred. Process ≠ outcome.
• Fast repair turnaround time
  – Depending on mission criticality and the availability of back‐ups, some failures and turnaround times are NOT acceptable to users
• Repeat work orders < certain threshold
  – Reasonable threshold depends on the type of failure
• Failed PMs < certain threshold
  – idem

Page 11: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Table of Contents

• Introduction
  – How do you convince surveyors that your maintenance program is effective?
• Evidence‐Based Maintenance
  – Maintenance planning (plan)
  – Maintenance implementation (do)
  – Maintenance monitoring (check)
  – Maintenance improvement (act)
• Discussion and Conclusions
  – Implementation lessons
  – Conclusions

[Diagram: Plan-Do-Check-Act cycle]

Page 12: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Maintenance Monitoring

[Diagram: Plan-Do-Check-Act cycle]

• Process Measures ("Do the right thing right!")
  – SPI/PM completion rates (TJC)
  – Maintenance logs (CMS)
  – Repair call response or turn‐around time
  Did you earn your diploma by day-dreaming every day in class (perfect attendance)?

• Outcome/Effectiveness Measures (evidence)
  – Uptime
  – Global failure rate
  – Patient incidents (including "near misses")
  – Failure codes
  – Repeated repairs
  – Others: MTBF, customer satisfaction, etc.

(Wang et al., CE Benchmarking, JCE, Jan-Mar 2008)

Page 15: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Data from the aviation industry (1968)

[Figure: the six failure-rate patterns (A-F) from the 1968 aviation industry data]

Page 16: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Maintenance Categories

• Proactive maintenance: tasks undertaken before a failure occurs to prevent the equipment from failing. Proactive maintenance must be technically feasible and worth doing. Typically useful for failure patterns A, B and C.

• Reactive ("default") maintenance: actions undertaken after a failure has occurred (to restore the equipment to original performance standards). Typically useful for failure patterns D, E and F.

[Figure: failure rate vs. time curves for the failure patterns]

Page 17: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes: Equipment Failures

Maintenance type: Scheduled maintenance (SM), including inspection, calibration, and preventive maintenance

• EF: Evident failure, i.e., a problem that can be detected (but was not reported) by the user without running any special tests or using specialized test/measurement equipment.
• HF: Hidden failure, i.e., a problem that could not be detected by the user unless running a special test or using specialized test/measurement equipment.
• PF: Potential failure, i.e., a failure that is either about to occur or in the process of occurring but has not yet caused the equipment to stop working or problems to patients or users.
• NPF: No problem found.

Page 18: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes: Equipment Failures (continued)

Maintenance type: Corrective maintenance (CM), including repairs performed for failures detected during SM

• UPF: Unpreventable failure, evident to user, typically caused by normal wear and tear but unpredictable.
• USE: Failure induced by use, e.g., abuse, abnormal wear & tear, accident, or environment issues. Does NOT include use error (typically no equipment failure).
• PPF: Preventable and predictable failure, evident to user.
• SIF: Service-induced failure, i.e., failure induced by corrective or scheduled maintenance that was not properly completed, or a part that was replaced and had premature failure ("infant mortality").
• CND: Cannot duplicate. Includes use errors. Same as NPF.
• FFPM: Failure found during PM (to avoid duplication of codes).

Page 19: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes: Peripheral Failures

Maintenance type: CM or SM

• BATT: Battery failure, i.e., battery(ies) failed before the scheduled replacement time.
• ACC: Accessory (excluding batteries) failure evident to user, typically caused by normal wear and tear.
• NET: Failure in or caused by network, while the equipment itself is working without problems. Applicable only to networked equipment.

NOTE: Any resemblance to prior works by A. Subhan, P. Thorburn, and M. Ridgway is NOT mere coincidence.
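
The code set above is easy to operationalize as a lookup table. Below is a minimal, hypothetical Python sketch (not from the presentation) that could seed a CMMS dropdown or validate technician entries; the structure and field names are our assumptions.

```python
# Hypothetical lookup table for the failure codes defined above; the dict
# structure is illustrative, not part of the original method.
FAILURE_CODES = {
    # Found during scheduled maintenance (SM)
    "EF":   "Evident failure (user-detectable, but not reported)",
    "HF":   "Hidden failure (requires special test or test equipment)",
    "PF":   "Potential failure (in progress, not yet disabling)",
    "NPF":  "No problem found",
    # Found during corrective maintenance (CM)
    "UPF":  "Unpreventable failure (normal wear and tear, unpredictable)",
    "USE":  "Use-induced failure (abuse, accident, environment)",
    "PPF":  "Preventable and predictable failure",
    "SIF":  "Service-induced failure (incl. premature part failure)",
    "CND":  "Cannot duplicate (incl. use errors; same as NPF)",
    "FFPM": "Failure found during PM",
    # Peripheral failures (CM or SM)
    "BATT": "Battery failed before scheduled replacement",
    "ACC":  "Accessory failure (excluding batteries)",
    "NET":  "Network-caused failure (networked equipment only)",
}

assert "HF" in FAILURE_CODES  # e.g., validate a technician's entry
```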

Page 20: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes Data Collection

Hospital | Staffed Beds | # Equipment | Teaching Nature | Starting Date | # Work Orders
A        |          161 |       5,200 | Non‐Teaching    |        9/1/08 |        12,892
B        |          256 |       2,800 | Non‐Teaching    |        3/1/09 |         6,265
C        |          360 |       4,500 | Non‐Teaching    |        4/1/09 |         9,205
D        |          415 |       6,800 | Non‐Teaching    |       10/1/08 |        18,201
E        |          586 |       9,200 | Minor Teaching  |       11/1/09 |        12,733
F        |          169 |       3,200 | Major Teaching  |       11/1/09 |         5,414
G        |          159 |       3,300 | Minor Teaching  |       11/1/09 |         5,396
H        |          193 |       2,400 | Non‐Teaching    |        2/1/10 |         3,402
I        |          439 |       6,600 | Minor Teaching  |        8/1/08 |        17,391
J        |          335 |       5,300 | Non‐Teaching    |        1/1/08 |        18,293
K        |          169 |       3,000 | Minor Teaching  |       11/1/09 |         5,616
L        |          318 |       5,500 | Minor Teaching  |        8/1/08 |        14,762
M        |          370 |       4,700 | Non‐Teaching    |        3/1/09 |         7,087
TOTAL    |        3,930 |      62,500 |                 |               |       136,657

Page 21: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes Data – Single equipment type from a single hospital

• 24 consecutive months of SM data

[Bar chart: Single Channel Infusion Pumps, SM only (Hospital D, 316 units); y-axis: estimated probability for each SM (0-100%); failure codes: NPF, ACC, BATT, EF, HF, PF]

Remember the Law of Large Numbers!

Page 22: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes Data – Single equipment type from a single hospital

• 24 consecutive months of CM data

[Bar chart: Single Channel Infusion Pumps, CM only (Hospital D, 316 units); y-axis: estimated probability for each CM (0-100%); failure codes: CND, UPF, ACC, BATT, USE, SIF, PPF]

Remember the Law of Large Numbers!

Page 23: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Annual Failure Probability (AFP)

AFP is the probability of finding a particular class of failure (e.g., HF) during a year, calculated as below:

• SM failure codes (EF, PF & HF):
  – #codes / #SMs completed
• CM failure codes (UPF, USE, PPF & SIF):
  – #codes / #CMs completed * ETFR, where ETFR = #CMs/year / #units (equipment type failure rate)
• ACC & BATT:
  – Combine SM and CM probabilities as calculated above
• No Fail(ure):
  – No Fail = 1 – sum(all other failure probabilities)
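
As a worked illustration of the arithmetic above, here is a short Python sketch; the counts are made up, and the function and variable names are ours, not the presentation's.

```python
def afp_sm(n_code, n_sm_completed):
    """AFP for an SM-found code (EF, PF, HF): #codes / #SMs completed."""
    return n_code / n_sm_completed

def afp_cm(n_code, n_cm_completed, n_cm_per_year, n_units):
    """AFP for a CM-found code (UPF, USE, PPF, SIF):
    (#codes / #CMs completed) * ETFR, with ETFR = #CMs/year / #units."""
    etfr = n_cm_per_year / n_units  # equipment type failure rate
    return (n_code / n_cm_completed) * etfr

# Illustrative fleet: 316 pumps, 600 SMs and 450 CMs completed in one year
afp = {
    "HF":  afp_sm(30, 600),            # 30 hidden failures found during SM
    "PPF": afp_cm(18, 450, 450, 316),  # 18 preventable failures found during CM
}
# ACC & BATT would combine the SM and CM probabilities as calculated above;
# finally No Fail = 1 - sum of ALL the other failure probabilities (only two
# codes are shown here, so this line is purely illustrative).
afp["No Fail"] = 1 - sum(afp.values())
print(afp)
```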

Page 24: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes Data – Single equipment type from a single hospital

• Combining SM & CM data ‐> Annual Failure Probability (AFP)

[Bar chart: Single Channel Infusion Pumps (Hospital D, 316 units); y-axis: estimated AFP per unit (0-100%); failure codes: No Fail, UPF, ACC, BATT, USE, EF, SIF, HF, PF, PPF; inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 25: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes Data – Single equipment type from a single hospital

• Comparing AFP from 2 consecutive years

[Bar chart: Single Channel Infusion Pumps (Hospital D, 316 units); estimated AFP per unit for Year 1 vs. Year 2 across No Fail, UPF, ACC, BATT, USE, EF, SIF, HF, PF, PPF; inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 26: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes Data – Single equipment type from a single hospital

[Bar chart: Vital Signs Monitor (Hospital A, 174 units); estimated AFP per unit across No Fail, UPF, ACC, BATT, USE, EF, SIF, HF, PF, PPF]

Page 27: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes Data – Single equipment type from a single hospital

[Bar chart: Portable Patient Monitors (Hospital C, 170 units); estimated AFP per unit across No Fail, UPF, ACC, BATT, USE, EF, SIF, HF, PF, PPF; inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 28: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes Data – Single equipment type from multiple hospitals

[Bar chart: General Purpose Electrosurgical Unit (ESU); estimated AFP per unit across No Fail, UPF, ACC, BATT, USE, EF, SIF, HF, PF, PPF for hospitals A-M (3 to 37 units each) plus the mean; inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 29: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes Data – Single equipment type from multiple hospitals

[Bar chart: Electronic Thermometer; estimated AFP per unit across No Fail, UPF, ACC, BATT, USE, EF, SIF, HF, PF, PPF for hospitals C-M (32 to 531 units each) plus the mean; inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 30: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Codes Data – Single equipment type from multiple hospitals

[Bar chart: Battery-Powered Mon/Pace/Defibrillator; estimated AFP per unit across No Fail, UPF, ACC, BATT, USE, EF, SIF, HF, PF, PPF for hospitals A-M (23 to 81 units each) plus the mean; inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 31: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Using Failure Codes Data

• Analyses performed in two ways:
  A. Comparing data obtained using different maintenance strategies within each equipment class ‐> determine effectiveness of maintenance strategies
  B. Considering all data for each class of equipment (regardless of maintenance strategy adopted) ‐> evaluate the effectiveness of CE activities, comparing current activities (SPI/PM, repairs, etc.) versus potential activities (i.e., impact of CE on equipment failures)

Page 32: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

A. Maintenance Strategies Comparison

Two ways to compare maintenance strategies:

• Data from different sites (lateral comparisons)
  – Advantage: no need to wait for data collection (assuming the same failure codes are adopted)
  – Disadvantage: there could be differences in brand/model and/or accessories, user care, etc.
• Data from same site (longitudinal studies)
  – Advantage: no differences in brand/model and/or accessories, user care, etc.
  – Disadvantage: need to wait for data collection

Page 33: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

(Lateral) Comparison of Maintenance Strategies

• Types of maintenance strategies adopted at different sites:
  – F3: fixed schedule, full service or inspection every 3 months
  – F6: fixed schedule, full service or inspection every 6 months
  – F12: fixed schedule, full service or inspection every 12 months
  – Samp: statistical sampling
  – R/R: repair or replace

Page 34: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Battery‐powered defibrillator/monitor/pacemaker

• Any detectable differences?

[Bar chart: estimated AFP per unit by failure code (No Fail, UPF, ACC, BATT, USE, EF, SIF, HF, PF, PPF) for strategies F3 (80 units) and F6 (327 units); inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 35: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Vital Signs Monitor

• Any detectable differences?

[Bar chart: estimated AFP per unit by failure code for strategies Samp (147 units), F12 (655 units), and R/R (71 units); inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 36: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Pulse Oximeters

• Any detectable differences?

[Bar chart: estimated AFP per unit by failure code for strategies Samp (149 units), F12 (464 units), and R/R (206 units); inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 37: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Sequential & Intermittent Compression Devices

• Any detectable differences?

[Bar chart: estimated AFP per unit by failure code for strategies Samp (278 units) and F12 (722 units); inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 38: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Single‐channel infusion pumps

• Any detectable differences?

[Bar chart: estimated AFP per unit by failure code for strategies Samp (542 units) and F12 (1,150 units); inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 39: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Radiant Infant Warmers

• Any detectable differences?

[Bar chart: estimated AFP per unit by failure code for strategies F6 (69 units), F12 (91 units), and Samp (19 units); inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 40: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Electronic Thermometers

• Any detectable differences?

[Bar chart: estimated AFP per unit by failure code for strategies F12 (231 units) and R/R (1,862 units); inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 41: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Answer to Surveyor Question

• How do you prove your non‐OEM maintenance strategy is not shortchanging patient safety?!
• Compare AFPs between "according to OEM recommendations" and "my maintenance strategy":
  – No difference (difference < SD): I should be allowed to use "my maintenance strategy"
  – Difference found: change maintenance strategy and monitor again => Maintenance Improvement
• In general, statistical sampling is preferable to repair/replace ("run to failure"), as you can monitor trends instead of waiting for annual reviews.
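
The "difference < SD" test lends itself to a short script. The sketch below is our illustration only, assuming simple binomial counting statistics for the standard deviation; the slide itself does not prescribe how the SD is obtained, and all counts are made up.

```python
import math

def afp_and_se(n_failures, n_opportunities):
    """AFP point estimate and its binomial standard error (an assumption)."""
    p = n_failures / n_opportunities
    return p, math.sqrt(p * (1 - p) / n_opportunities)

# Illustrative counts: hidden failures under the OEM schedule vs. sampling
p_oem, se_oem = afp_and_se(12, 655)  # OEM-recommended strategy fleet
p_alt, se_alt = afp_and_se(4, 147)   # "my maintenance strategy" fleet

diff = abs(p_oem - p_alt)
sd = math.sqrt(se_oem**2 + se_alt**2)  # SD of the difference
if diff < sd:
    print("No difference detected: keep the alternative strategy")
else:
    print("Difference found: change strategy and monitor again")
```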

Page 42: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Table of Contents

• Introduction
  – How do you convince surveyors that your maintenance program is effective?
• Evidence‐Based Maintenance
  – Maintenance planning (plan)
  – Maintenance implementation (do)
  – Maintenance monitoring (check)
  – Maintenance improvement (act)
• Discussion and Conclusions
  – Implementation lessons
  – Conclusions

[Diagram: Plan-Do-Check-Act cycle]

Page 43: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Maintenance Improvement

• Maintenance revision & continual improvement:
  – Inventory classification revision
  – SM frequency revision
  – Work instruction (tasks) revision
• ...while continuing to monitor effectiveness (evidence) and efficiency using:
  – Uptime
  – Failure rate
  – Patient incidents (including "near misses")
  – Failure codes
  – Others: MTBF, customer satisfaction, etc.
  – Financial indicators

[Diagram: Plan-Do-Check-Act cycle]

Page 44: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

B. Evaluation of CE Activities

Grouping of failure codes by CE action:

Failure Code | CE Responsibility             | Action Class
NPF          | none                          | None or review
UPF          | advise Purchasing             | FUTURE
ACC          | guide users and Purchasing    | INDIRECT
BATT         | guide users and Purchasing    | INDIRECT
NET          | work with IT                  | INDIRECT
USE          | guide users and Facilities    | INDIRECT
EF           | guide users                   | INDIRECT
SIF          | educate staff and advise OEMs | DIRECT
HF           | review SM program             | DIRECT
PF           | review SM program             | DIRECT
PPF          | review SM program             | DIRECT
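
Rolling per-code AFPs up into these action classes is a one-liner once the mapping exists. A minimal Python sketch follows; the mapping is taken from the table above, while the AFP figures in the example are illustrative.

```python
# Mapping from the grouping table above; NPF carries no CE action beyond review.
ACTION_CLASS = {
    "UPF": "FUTURE",
    "ACC": "INDIRECT", "BATT": "INDIRECT", "NET": "INDIRECT",
    "USE": "INDIRECT", "EF": "INDIRECT",
    "SIF": "DIRECT", "HF": "DIRECT", "PF": "DIRECT", "PPF": "DIRECT",
}

def afp_by_class(afp_by_code):
    """Sum per-code AFPs into FUTURE/INDIRECT/DIRECT (and ALL) totals."""
    totals = {"FUTURE": 0.0, "INDIRECT": 0.0, "DIRECT": 0.0}
    for code, afp in afp_by_code.items():
        cls = ACTION_CLASS.get(code)  # skips NPF and unknown codes
        if cls:
            totals[cls] += afp
    totals["ALL"] = sum(totals.values())
    return totals

# Illustrative per-code AFPs for one equipment type
print(afp_by_class({"UPF": 0.24, "EF": 0.30, "ACC": 0.20, "BATT": 0.06,
                    "SIF": 0.010, "HF": 0.008, "PF": 0.005, "PPF": 0.004}))
```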

Page 45: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Battery‐powered defibrillator/monitor/pacemaker

[Bar chart: Battery-Powered Mon/Pace/Defibrillator; estimated AFP per unit by failure code, grouped into CE future (UPF), CE indirect (ACC, BATT, USE, EF), and CE direct (SIF, HF, PF, PPF); inset (0-10%) zooms in on SIF, HF, PF, PPF]

Page 46: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Failure Code Grouping Results

[Four pie charts, reconstructed as a table; annual shares per unit by failure-code group:]

Equipment Type                         | No Failure | Future | Indirect | Direct
Battery-Powered Mon/Pace/Defibrillator |        61% |     9% |      28% |     2%
Vital Signs Monitors                   |        35% |    16% |      47% |     2%
Pulse Oximeters                        |        71% |     6% |      22% |     1%
Single-Channel Infusion Pumps          |        17% |    24% |      56% |     3%

Page 47: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Using the Risk‐Management Approach to Determine Impact

• Risk is defined as "the combination of the probability of occurrence of harm and the severity of that harm." (ISO/IEC Guide 51:1999 and ISO 14971:2007)
• Calculated risk = probability * severity [of harm]

The "risk-based criteria" should actually be called "severity-based criteria," due to the lack of probability!
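
In other words, each equipment type's annual risk is just its AFP times an assigned severity, as the later slides tabulate. A trivial sketch of that product, using the volume ventilator row of the risk table a few slides below:

```python
def annual_risk(afp, severity):
    """Calculated risk = probability * severity (severity on a 0-100 scale)."""
    return afp * severity

# Volume ventilator: ALL-failure AFP = 71.1%, severity = 100 -> risk ~ 71
print(round(annual_risk(0.711, 100)))  # 71
```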

Page 48: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Estimation of Risk

• Estimation of the Probability of Harm
  – A very exaggerated estimate of the probability is the AFP (because it ignores other protective mechanisms)
• Estimation of the Severity of Harm
  – The severity is assigned between 0% and 100%, depending on the impact on the patient (no harm to death)

[Figure: Swiss cheese model of accident causation, adapted from Reason (2000), Duke Univ. MC; patientsafetyed.duhs.duke.edu/module_e/swiss_cheese.html]

Page 49: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Fennigkoh & Smith Model

Analyzed equipment types (F&S scores: Function, "Risk", Maintenance; EM = Function + "Risk" + Maintenance):

Equipment Type                  | #Hospitals | #Units | #WOs | Function | "Risk" | Maintenance | EM
Anesthesia machine              |          7 |    152 |  767 |       10 |      5 |           5 | 20
Neonatal ventilator             |          3 |     28 |   79 |       10 |      5 |           5 | 20
Portable ventilator             |          3 |     60 |  226 |       10 |      5 |           5 | 20
Volume ventilator               |          3 |     50 |  180 |       10 |      5 |           5 | 20
Batt-pow mon/pace/defibrillator |          7 |    407 | 1567 |       10 |      5 |           4 | 19
PCA pump                        |          7 |    430 |  700 |        9 |      5 |           4 | 18
Syringe infusion pump           |          5 |    251 |  438 |        9 |      4 |           4 | 17
Multi-channel infusion pump     |          5 |    256 |  498 |        9 |      4 |           4 | 17
Single-channel infusion pump    |          6 |   1692 | 4175 |        9 |      4 |           4 | 17
ESU, general purpose            |          7 |    164 |  411 |        9 |      4 |           3 | 16
Blood warmer, circ. fluid       |          4 |     56 |  212 |        9 |      3 |           3 | 15
Enteral feeding pump            |          8 |    301 |  488 |        8 |      4 |           3 | 15
Physiological monitoring system |          5 |    286 |  280 |        7 |      4 |           3 | 14
Ultrasound scanner, generic     |          5 |     59 |  245 |        6 |      3 |           5 | 14
Seq & interm compression dev    |          7 |   1000 | 1287 |        8 |      4 |           2 | 14
Vital signs monitor             |          7 |    872 | 1921 |        6 |      3 |           3 | 12
Pulse oximeter                  |          6 |    818 |  840 |        6 |      3 |           2 | 11
NIBP monitor                    |          6 |    223 |  403 |        6 |      3 |           2 | 11
Infant scale                    |          8 |    159 |  175 |        2 |      3 |           2 |  7
Infant warmer                   |          7 |    179 |  448 |        2 |      3 |           2 |  7
Blanket warmer                  |          6 |    157 |  164 |        2 |      1 |           2 |  5
Patient scale, floor model      |          6 |    314 |  330 |        2 |      1 |           1 |  4

Page 50: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Estimated Annual Failure Probability

Equipment Type                  | FUTURE | INDIRECT | DIRECT |   ALL | F&S EM
Neonatal ventilator             |  23.6% |    16.9% |  11.1% | 51.6% |     20
Physiological monitoring system |  13.1% |    22.7% |   9.3% | 45.1% |     14
Volume ventilator               |  43.4% |    18.7% |   9.0% | 71.1% |     20
Blood warmer, circ. fluid       |   1.2% |     5.6% |   5.8% | 12.6% |     15
Anesthesia machine              |  29.0% |    25.7% |   5.3% | 60.0% |     20
Portable ventilator             |  27.0% |    31.9% |   5.3% | 64.2% |     20
Single-channel infusion pump    |  24.4% |    55.6% |   2.7% | 82.7% |     17
Syringe infusion pump           |  12.4% |    11.4% |   2.7% | 26.5% |     17
PCA pump                        |  11.8% |    17.8% |   2.4% | 32.0% |     18
Vital signs monitor             |  15.8% |    47.0% |   2.2% | 65.0% |     12
Ultrasound scanner, generic     |  28.3% |    14.7% |   2.0% | 45.0% |     14
ESU, general purpose            |  12.7% |     8.1% |   2.0% | 22.8% |     16
Batt-pow mon/pace/defibrillator |   8.6% |    28.3% |   1.9% | 38.9% |     19
Infant warmer                   |  19.1% |     9.5% |   1.8% | 30.4% |      7
NIBP monitor                    |  24.3% |    47.2% |   1.8% | 73.2% |     11
Infant scale                    |   4.2% |    18.8% |   1.8% | 24.8% |      7
Enteral feeding pump            |   8.6% |    16.3% |   1.5% | 26.4% |     15
Pulse oximeter                  |   5.7% |    22.3% |   1.5% | 29.5% |     11
Blanket warmer                  |  18.5% |     7.6% |   1.3% | 27.4% |      5
Patient scale, floor model      |   7.6% |    17.8% |   1.1% | 26.4% |      4
Seq & interm compression dev    |  14.1% |    18.6% |   0.5% | 33.2% |     14
Multi-channel infusion pump     |  14.7% |    26.0% |   0.4% | 41.1% |     17
Mean                            |  16.7% |    22.2% |   3.3% | 42.3% |
Standard deviation              |  10.0% |    13.2% |   3.0% | 19.4% |

Page 51: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Calculated Annual Risk

Equipment Type                  | Severity | FUTURE | INDIRECT | DIRECT |  ALL | F&S EM
Volume ventilator               |      100 |     43 |       19 |      9 |   71 |     20
Portable ventilator             |      100 |     27 |       32 |      5 |   64 |     20
Anesthesia machine              |      100 |     29 |       26 |      5 |   60 |     20
Neonatal ventilator             |      100 |     24 |       17 |     11 |   52 |     20
Single-channel infusion pump    |       60 |     15 |       33 |      2 |   50 |     17
Batt-pow mon/pace/defibrillator |       90 |      8 |       25 |      2 |   35 |     19
Physiological monitoring system |       70 |      9 |       16 |      7 |   32 |     14
NIBP monitor                    |       40 |     10 |       19 |      1 |   29 |     11
PCA pump                        |       90 |     11 |       16 |      2 |   29 |     18
Multi-channel infusion pump     |       70 |     10 |       18 |      0 |   29 |     17
Vital signs monitor             |       40 |      6 |       19 |      1 |   26 |     12
Ultrasound scanner, generic     |       50 |     14 |        7 |      1 |   23 |     14
Syringe infusion pump           |       80 |     10 |        9 |      2 |   21 |     17
Infant scale                    |       80 |      3 |       15 |      1 |   20 |      7
Infant warmer                   |       50 |     10 |        5 |      1 |   15 |      7
Pulse oximeter                  |       50 |      3 |       11 |      1 |   15 |     11
ESU, general purpose            |       60 |      8 |        5 |      1 |   14 |     16
Enteral feeding pump            |       40 |      3 |        7 |      1 |   11 |     15
Seq & interm compression dev    |       30 |      4 |        6 |      0 |   10 |     14
Blanket warmer                  |       30 |      6 |        2 |      0 |    8 |      5
Blood warmer, circ. fluid       |       50 |      1 |        3 |      3 |    6 |     15
Patient scale, floor model      |       20 |      2 |        4 |      0 |    5 |      4
Mean                            |          |   11.6 |     14.2 |    2.6 | 28.3 |
Standard deviation              |          |   10.5 |      9.3 |    3.0 | 19.5 |

Page 52: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Mean Values of Probability & Risks

• Why are you chasing the smallest slices if there are "low‐hanging fruits" (larger slices) out there?

[Pie chart: Mean AFP for 22 Equipment Types: No Failure 59%, Indirect 22%, Future 16%, Direct 3%]
[Pie chart: Mean Annual Risk for 22 Equipment Types: Indirect 14.2, Future 11.6, Direct 2.6]

Page 53: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Performance Improvement (NOT just maintenance improvement)

Direct failure group:
  – Service-induced failures (SIF)
  – Failures not evident to (hidden from) users (HF)
  – Deteriorations in progress that are likely to become failures: potential failures (PF)
  – Preventable and predictable failures (PPF)
  Actions: Review and revise maintenance program, e.g., increase frequency, add new tasks, and change strategy.

Indirect failure group:
  – Accessory failures (ACC)
  – Battery failures (BATT)
  – Network failures (NET)
  – Failures induced by abuse, accidents, or environment issues (USE)
  – Failures evident to users but not reported (EF)
  Actions: Provide training to users, feedback to Purchasing, and assistance to facility managers in reducing power line issues, water and air quality, HVAC, humidity control, etc.

Future failure group:
  – Unpreventable failures (UPF)
  Actions: Improve selection in future acquisitions, favoring more reliable products and standardization.

Page 54: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

CE Impact Analysis - Conclusions

• CE impact is reaching its limits, i.e., significant investments of resources are needed for small gains in reducing risks.
• However, much higher impact (reduction of risks) can be achieved by broadening the horizon and helping users, Facilities, and Purchasing ‐> i.e., CE should NOT focus solely on what it can do alone (i.e., SM).
• The NIBP monitor example shows that the old myth of zero (negligible) "PM yield" needs to be abandoned. Need to consider the frequency and the severity of all the failures (ALL risk), not just those managed by CE.
• In essence: reach out of your comfort zone (maintenance) to bring more impact to patient care/risk using your expertise!

Page 55: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Table of Contents

• Introduction
  – How do you convince surveyors that your maintenance program is effective?
• Evidence‐Based Maintenance
  – Maintenance planning (plan)
  – Maintenance implementation (do)
  – Maintenance monitoring (check)
  – Maintenance improvement (act)
• Discussion and Conclusions
  – Implementation lessons
  – Conclusions

[Diagram: Plan-Do-Check-Act cycle]

Page 56: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Implementation Lessons (aka how we made it work)

• Put failure codes at the top of selectable choices (e.g., by adding numbers to the front of the codes, so they "float" to the top: 1NPF).
• Encourage staff to discuss questionable codes and HF with their manager to ensure coding accuracy.
• Monthly verification and corrections (see the sketch below):
  – Missing codes (work orders without codes)
  – Logically‐wrong codes (e.g., HF in repairs)
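
The monthly verification step can be automated against a CMMS export. A hedged Python sketch, assuming simple dict-shaped work orders (the field names and data layout are hypothetical):

```python
# Codes that are logically valid on each work-order type (from the tables above).
SM_CODES = {"NPF", "EF", "HF", "PF", "ACC", "BATT", "NET"}
CM_CODES = {"CND", "UPF", "USE", "PPF", "SIF", "FFPM", "ACC", "BATT", "NET"}

def monthly_audit(work_orders):
    """Flag missing and logically-wrong failure codes for manager review."""
    issues = []
    for wo in work_orders:
        allowed = SM_CODES if wo["type"] == "SM" else CM_CODES
        if not wo.get("code"):
            issues.append((wo["id"], "missing failure code"))
        elif wo["code"] not in allowed:
            issues.append((wo["id"], f"{wo['code']} is not valid on a {wo['type']}"))
    return issues

# e.g., an HF on a repair is logically wrong: a hidden failure is one the user
# could not have noticed, so it should not appear on a user-reported CM.
print(monthly_audit([{"id": 101, "type": "CM", "code": "HF"},
                     {"id": 102, "type": "SM", "code": None}]))
```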

Page 57: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Conclusions

• Clinical Engineering must evolve together with healthcare:
  – Follow progress of medical equipment design and manufacturing (JC's 10-year root‐cause analysis (RCA) of sentinel events indicates most of them are due to use errors and communication problems)
  – Incorporate the mission‐criticality concept
  – Adopt the separation of risk and maintenance needs (high risk ≠ high maintenance, but low incidence of failed SM ≠ no SM needed)
  – Learn from Reliability‐Centered Maintenance (RCM) experience accumulated in industrial maintenance (but without fully adopting it)
  – Progress from subjective, intuitive craftsmanship to scientific, evidence‐based engineering

Page 58: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Conclusions (continued)

• Refocus resources from "scheduled maintenance" (SM: SPIs and PMs) to higher‐impact tasks, e.g., use error tracking, "self‐identified" failures and repairs ("rounding"), user training, and working with Facilities and Purchasing.
• It is always a balancing act:
  – Needs (mission, safety, revenue, etc.)
  – Re$ource$ (human, technical, financial, etc.)
  (that's why it is engineering: find the best "balanced" solution)

Page 59: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

Bottom Line

[Diagram: Plan-Do-Check-Act cycle]

• Evidence‐based Maintenance (EBMaint) allows us to prove to CMS and TJC that we are NOT shortchanging patient safety when we deviate from OEM recommendations (effectiveness).
• EBMaint allows us to move beyond complying with CMS requirements and TJC standards and enhance user satisfaction and patient safety.
• EBMaint motivates us to continually review and improve equipment maintenance strategies.
• EBMaint also helps to prove to the healthcare organizations that we are using their limited resources in the most productive manner (efficiency).

Page 60: Evidence-Based Maintenance: How to Evaluate the Effectiveness of your Maintenance Strategies

THANK YOU!

• Please contact us if you have any questions or suggestions.

Binseng Wang, ScD, CCE, fAIMBE, fACCE
• Vice President, Performance Mgmt & Regulatory Compliance
• Telephone: 704‐948‐5729
• Email: wang‐[email protected]