
1

FY02 ASA Presentation

Operate Emergency Communication Center

Presented by:

G. Borden, G. Elliott, G. Harris, L. Martinez, M. Sheelor

Office of Research Services
National Institutes of Health

18 November 2002

2

Table of Contents

Main Presentation
• ASA Template – 4
• Customer Perspective – 5
  • Customer Segmentation – 6
  • Customer Satisfaction – 7
  • Unique Customer Measures – 8
• Internal Business Process Perspective – 9
  • Service Group Block Diagram – 10
  • Conclusions from Discrete Services Deployment Flowcharts – 11
  • Process Measures – 12
• Learning and Growth Perspective – 13
  • Conclusions from Turnover, Sick Leave, Awards, EEO/ER/ADR Data – 14
  • Analysis of Readiness Conclusions – 15
  • Unique Learning and Growth Measures – 16
• Financial Perspective – 17
  • Unit Cost – 18
  • Asset Utilization – 19
  • Unique Financial Measures – 20
• Conclusions and Recommendations – 21
  • Conclusions from FY02 ASA – 22
  • Recommendations – 23

3

Table of Contents

Appendices
Page 2 of ASA Template

Customer segments graphs

Customer satisfaction graphs

Block diagram

Process maps

Process measures graphs

Learning and Growth graphs

Analysis of Readiness Information

4

ASA TEMPLATE

5

OPERATE EMERGENCY COMMUNICATION CENTER

Team Leader: G. Borden – ECC
Team Members: Gary Elliott, Cassandra Harris, Louise Martinez, Marisa Sheelor
Service Group: Provide quality and organizational development services
Mission: Provide the single source of contact for emergency assistance on campus, quickly directing calls to the appropriate responding authority, and maintaining the necessary databases to provide crime-related information to authorized users.
Discrete Services: Operate Emergency Communication Center; Monitor Closed-Circuit TVs; Manage National Crime Information Center (NCIC) Operations
Customer Value Proposition: Customer Intimacy / Operational Excellence / Product Leadership
Service Strategy: Sustain / Growth / Harvest
(The selected value proposition and strategy are marked with an "X" on the original slide.)

6

What are we about?
• We handle over 4,800 telephone calls per month
• Responsible for over 32 lines on the telephone system
• Monitor 17 CCTVs with over 120 images of the NIH campus
• Monitor 24 alarm devices and the fire alarm system
• Make over 50,000 National Crime Information Center (FBI NCIC) inquiries
• Criminal and employment background checks
• Analyze stressful situations – 911 calls
• The only federal 911 center in the DC area
• Three shifts: 3 personnel on midnights, 3 on mid-shift, and 5 (including the supervisor) on the day shift
• TTY service for the hearing impaired

7

If we are not handling emergencies, what? Types of non-emergency calls received:
• Employee building access
• Parking complaints
• Ticket complaints
• Directions to NIH
• Events/employment information
• Complaints about parking meters
• Calls for officers, administrative staff, and ECC personnel
• Lost & found
• Disabled vehicles
• Citizens locked out of their vehicles

8

Customer Perspective

9

OUR CUSTOMERS

• ECC’s prime customer is the Police Branch of which we are an integral part

• NOTE: We concentrated on Police/Fire as customers with the idea that the NIH community is the ultimate customer

ECC Prime Customer Base, Total Calls June 01 – June 02 (Total Calls = 84,693):
• Police: 65,466 (75%)
• Fire: 19,226 (25%)

10

CCTV Monitoring: DS2

• From May to October 2002 there were two incidents
• CCTVs are providing crime deterrence
• Increased security and CCTVs have decreased crime

CCTV Monitoring, May – October 2002:
• Police Branch: 100% (2 reports)
• Crime Prevention: 0%

11

Customer Segmentation: The FBI National Crime Information Center (NCIC), 2002 (Total Requests = 50,065)
• Police Branch: 75%
• Crime Prevention: 25%

12

FY02 ORS Customer Scorecard Data for the Annual Self Assessments

Service Group 13:

Operate Emergency Communication Center

16 October 2002

Summary Prepared by the Office of Quality Management (OQM)

13

Survey Distribution

Number of surveys distributed (Emergency Communication Center): 50
Number of surveys returned (Emergency Communication Center): 17
Response rate: 34%

14

Survey Respondents: FY02 Respondents by IC

[Bar chart of the number of respondents by NIH IC. Data based on 17 respondents.]

15

Radar Chart: FY02 Product/Service Satisfaction Ratings

[Radar charts of Cost, Quality, Timeliness, and Reliability ratings on a 1–10 scale.]
• ORS Index = 8.27: Cost 7.42; Quality, Timeliness, and Reliability between 8.28 and 8.48 (data based on 436 respondents)
• Service Group Index = 6.53: Cost 7.33 (highest); Reliability 6.38 (lowest); Quality and Timeliness 6.44 and 6.63 (data based on 17 respondents)

Note: The rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.

16

Radar Chart Interpretations: Product/Service
• Comparison of ORS and ECC product/service satisfaction ratings
• ECC's overall score is below the ORS index (ORS 8.27 vs. ECC 6.53), with ECC's reliability the lowest of the four categories (reliability, cost, timeliness, and quality)
  • Lack of formal training may be an issue in providing correct information or responses
  • Ratings may reflect continuing difficulties in communication between the Fire Department and ECC
• ECC's cost score is slightly below ORS (ORS 7.42 vs. ECC 7.33); however, cost is ECC's highest-rated category
  • Respondents think ECC is a good value

17

Radar Chart: FY02 Customer Service Satisfaction Ratings

Note: The rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.

[Radar charts of Availability, Responsiveness, Convenience, Competence, and Handling of Problems ratings on a 1–10 scale.]
• ORS Index = 8.55: all five category ratings between 8.51 and 8.60 (data based on 436 respondents)
• Service Group Index = 6.81: category ratings between 5.82 (Competence, the lowest) and 7.59 (data based on 17 respondents)

18

Comparison of ORS & ECC Customer Service Satisfaction ratings

• Compared to ORS customer service satisfaction ratings, ECC is below (ORS 8.55 vs. ECC 6.81), with ECC's competence the lowest rating of the five categories (availability, handling of problems, responsiveness, convenience, competence)
• The percentage of mistakes is minuscule, but the consequences are major
• The only ways to reduce the number of mistakes further are (a) to provide additional training for employees and (b) to provide more staff – rushed call takers make mistakes

19

Scatter Diagram: FY02 Customer Importance and Satisfaction Ratings

Note: The Importance rating scale ranges from 1 - 10 where “1” represents Unimportant and “10” represents Important. The Satisfaction rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding.

[Scatter plot of satisfaction (x-axis, 1–10) against importance (y-axis, 1–10), divided into four quadrants: satisfied/important, satisfied/not important, not satisfied/important, and not satisfied/not important. Data based on 17 respondents.]

20

Scatter Diagram: FY02 Customer Importance and Satisfaction Ratings – A Closer Look

Note: A smaller portion of the chart is shown so that the individual data points can be labeled.

[Enlarged view of the upper portion of the scatter plot with the individual category points labeled: Cost, Convenience, Responsiveness, Availability, Handling of Problems, Competence, Quality, Timeliness, and Reliability, all in the satisfied/important quadrant. Data based on 17 respondents.]

21

Scatter Diagram Interpretation

• The Police & Fire Branches believe availability and convenience are very important.

• Cost does not appear to be a major factor for satisfaction.

• It is interesting to note that better handling of problems would contribute greatly to improved customer satisfaction.

• Again, we need to invest in more training and additional staff.

22

Some of the survey comments on what was done particularly well:
• One or two dispatchers are competent to handle the duties required; the rest need improvement.
• Checking on building alarms in and out of service.
• Keeping track of officers is usually done very well.
• Good service.
• All phases of dispatch and communication services are performed exceptionally well.
• The new supervisor of ECC has improved the quality of ECC services and relationships with the customers.

23

Returned survey comments on what needs to be improved:
• Need to place lead dispatchers on all shifts; all of them work day work.
• Timing.
• ECC needs to follow the proper command structure; in other words, dispatchers need to stop placing themselves in the role of supervisor, which they are not.
• How dispatchers key dispatch calls.
• Better dispatchers – people who understand police/fire/rescue operations, dispatchers who speak clear English, dispatchers who are competent.
• Quality and competency of ECC staff; more staff; better scheduling so "NO" police officers have to fill in; repeaters for all 3 channels for better communications on and off campus.
• The whole system.
• Speed/accuracy of NCIC for officers, for some dispatchers.
• Nothing to complain about at this time.
• Better equipment.

24

Other Comments

• All dispatchers need official training before being assigned to the dispatch center. Formal training should be conducted by full-time trainers, off campus if need be.
• Keep training.
• The day crew is very reliable and responsive, but nights and weekends are very shady and fly-by-night; feel we cannot rely on them.
• ECC definitely needs to improve the quality and performance of dispatchers, since NIH police and fire respond to life-threatening situations!
• Better dispatchers, better training, better quality assurance, and review.
• Send your staff to an ECC that handles real emergencies every day to get them real training.
• The quality of dispatching changes between operations/shifts and by workload. I believe they all try very hard, doing their best.

25

To Summarize: Increased Customer Satisfaction depends on

• Quality – customer satisfaction training
• Reliability – work toward "industry standards"
• Timeliness – constant scenario drills
• Competence – better hiring practices
• Handling of problems – training issues
• More staff – tired and rushed dispatchers make mistakes

26

Recommendations
• Improve pay and benefits to attract high-quality personnel
• Have a panel of ECC/Fire personnel interview prospective recruits to improve the selection process
• Structure the probationary period – 12-week training to screen out less-successful recruits
• Dispatcher certification
• Formal "in-service" program
• Training in competence areas (e.g., customer service, proper dispatching procedures)

27

Internal Business Process Perspective

28

National Institutes of Health Emergency Communication Center (ECC) Basic Block Chart

1. Request for service
2. The Emergency Communication Center evaluates the request and directs the proper response
3. The requester reacts to the information (e.g., Police/Fire Dept response to an alarm call)

29

Relationship between Service Group and Discrete Services
• NIH ECC is the only federal ECC to establish an enhanced 911 Emergency Communication Center in the DC area
• ECC is the focal point between the Police, Fire, and the NIH community
• ECC answers emergency and non-emergency calls, analyzes stressful situations, monitors approximately 130 CCTVs, and monitors door alarm devices and fire alarms
• ECC utilizes the Federal Bureau of Investigation's National Crime Information Center (NCIC) for investigative purposes

30

Organization chart:
• Chief of Police
  • Deputy Chief of Police
    • Commander, Support Service
    • Commander, Community Oriented Policing
    • Crime Prevention (7 Crime Prevention Specialists)
    • Emergency Communication Center Supervisor
      • Lead, 1st Relief (2 Dispatchers)
      • Lead, 2nd Relief (3 Dispatchers)
      • Lead, 3rd Relief (3 Dispatchers)

31

Conclusions from Discrete Services Deployment Flowcharts
• Our Service Group completed 3 deployment flowcharts for 3 discrete services
• What we have learned from the deployment flowcharts:
  • They provide an overall view of the total process, from requester to completed actions
  • They show the sequence of events
  • Each chart shows a strong symbiotic relationship between ECC, Police, and Fire (e.g., maintaining radio contact for personal safety)
  • They show areas where training needs improvement (e.g., obtaining relevant information from the requester)
  • They show no outstanding signals

32

Process Measures

• Process measures for each discrete service
  • DS1: Operate ECC
    • Total calls received in ECC in 2002
    • Average time to dispatch a call
    • Actual average monthly overtime
  • DS2: Monitor CCTV equipment
    • Relatively new DS measurement – established a baseline for future measures (2.8 hours, 24/7)
  • DS3: Respond to NCIC requests
    • Total number of requests per year
    • Relatively new DS measurement – established a baseline for future measures

33

Overtime Hours per Month, 2002
• Jan: 61.18
• Feb: 58.29
• Mar: 49.71
• Apr: 51.47
• May: 42.25
• Jun: 17.27
• Jul: 12.82
• Aug: 20.93

34


35

36

Average Time to Dispatch a Call in Minutes, 2002
[Bar chart, February – September 2002: monthly averages of roughly 4 to 5 minutes.]

37

Average Time To Dispatch A Call in Minutes Control Chart

• Within limits: No major signals
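As a rough illustration of how limits for a monthly series like this are commonly derived, here is a minimal individuals (XmR) control chart sketch in Python; the input values are hypothetical monthly averages in the 4–5 minute range, not the ECC's actual data.

```python
# Sketch: individuals (XmR) control chart limits for a monthly dispatch-time series.
# The values below are hypothetical monthly averages, not the ECC's actual data.
from statistics import mean

dispatch_minutes = [4.25, 5.0, 4.0, 4.0, 4.0, 5.0, 5.0, 4.0]

center = mean(dispatch_minutes)
moving_ranges = [abs(b - a) for a, b in zip(dispatch_minutes, dispatch_minutes[1:])]
mr_bar = mean(moving_ranges)

# Standard XmR limits: center line +/- 2.66 * average moving range.
ucl = center + 2.66 * mr_bar
lcl = max(center - 2.66 * mr_bar, 0.0)

signals = [x for x in dispatch_minutes if not lcl <= x <= ucl]
print(f"center={center:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}  points outside limits: {signals}")
```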

38

911 Control Chart

39

Total Calls Received in the Emergency Communications Center, June 2001 – June 2002 (Total = 84,693)
• Non-Emergency: 64,290 (76%)
• Blue Light Phone: 11,868 (14%)
• 911 Emergency: 8,535 (10%)
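As a quick check, the percentages above follow directly from the call counts; a minimal sketch:

```python
# Quick check of the call-mix percentages from the counts reported on this slide.
calls = {"Non-Emergency": 64_290, "Blue Light Phone": 11_868, "911 Emergency": 8_535}
total = sum(calls.values())  # 84,693

for category, count in calls.items():
    print(f"{category}: {count:,} calls ({count / total:.0%} of total)")
```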

40

Process Measure Findings

• Overtime ran 40–60 hours per month at the beginning of the year
  • Mandated OT because of heightened security – an outside influence
• Emergency incoming calls (approximately 10% of total calls) are within process control limits
• Average time to dispatch a call is between 4 and 5 minutes
  • Training issues with the updated version of the CAD system

41

FY02 Learning and Growth (L&G) Data for the Annual Self Assessments

Service Group 13:

Operate Emergency Communications Center

26 September 2002

Summary Prepared by the Office of Quality Management

42

Learning and Growth Data Table

About 2 days of sick leave per employee; about 1 award for every 2 employees; 14% employee turnover; 1 ER case out of 7 employees.

Service Group 13 totals:
• Population estimate: 7
• Separations: 1 (turnover rate 0.14)
• Total hours of sick leave used: 140 (average 19 hours)
• Awards received: 4 (average 0.55)
• EEO complaints: 0 (average 0.00)
• ER cases: 1 (average 0.14)
• ADR cases: 0 (average 0.00)

43

Interpreting ECC’s Data

• Group turnover rate: ECC is slightly below average
• Average hours of sick leave used: ECC is below average
• Average number of awards received: ECC is slightly below average
• Average number of EEO complaints: none
• Average number of ER cases: one case in 2001

44

Summary of ECC's Learning and Growth Data
• Fourteen percent employee turnover
• About 2 days of sick leave used per employee
• About one award for every 2 employees
• No EEO complaints for the year 2002
• Fourteen percent Employee Relations cases (actually only one case, in 2001)
• Average number of ADR cases: none

45

Conclusions from Turnover, Sick Leave, Awards, EEO/ER/ADR Data
• Group turnover rate
  • Fourteen percent
  • Out of 32 SGs, ECC ranks slightly below the 50th percentile, with 50% being the mean
• Average hours of sick leave used
  • With 35 hours as the average, ECC is well below average at about 19 hours per employee; this indicates that sick leave is not being abused
• Average number of awards received
  • ECC is slightly below average; improvement is needed
  • Note: NIH in general needs to improve its awards program

46

Conclusions (continued)
• Average number of EEO complaints
  • None – this indicates adherence to the rules and policies concerning EEO practices
• Average number of ER cases
  • One case in 2001, which gives 14%
  • Compared with other SGs, ECC was slightly above the average of 0.12 (ECC: 0.14)
  • Note: ECC is in the process of forming a union, thus providing another means to address ER issues
• Average number of ADR cases
  • None – ECC management and employees are working to resolve issues

47

Financial Perspective

48

Financial Findings: Overtime
• Trend!
• Until ECC reaches "full" staffing, OT costs will remain part of the "normal" operating procedure
• ECC is using approximately 20% OT, which equates to 2.82 FTEs (see the sketch below)
  • Avg. salary: $38,314
  • $38,314 × 11 (number of FTEs) = $421,450 (straight time)
  • Avg. OT rate: $27.63/hour; 904.50 total OT hours used; OT cost: $107,881
  • Straight salary $421,450 + OT salary $107,881 = $529,331 (total personnel cost)
  • $107,881 / $38,314 = 2.82; 2.82 × $38,314 = $107,881
• Hiring 3 additional personnel could be a break-even point
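A minimal sketch of the FTE-equivalence arithmetic behind the break-even point above, using the salary and overtime-cost figures as they appear on this slide:

```python
# Sketch of the overtime-to-FTE reasoning on this slide, using its figures as given.
avg_salary = 38_314        # average annual salary
current_ftes = 11          # current staffing level
overtime_cost = 107_881    # reported overtime cost

straight_time = avg_salary * current_ftes             # ~$421,450 on the slide
total_personnel_cost = straight_time + overtime_cost  # ~$529,331 on the slide

# Overtime spending expressed as straight-time FTE equivalents (~2.82), which is
# the basis for the "hiring ~3 additional personnel" break-even point.
fte_equivalent = overtime_cost / avg_salary
print(f"OT as FTE equivalents: {fte_equivalent:.2f}")
print(f"Total personnel cost: ${total_personnel_cost:,}")
```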

49

Asset Utilization Measures

• This activity is difficult to measure in terms of standard outputs
  • How do you measure standby?
• Guess-timate: 10% non-productive time, based on observation
• Asset utilization: 90% (see the sketch below)
  • Max input: 11 × 1,840 = 20,240 hours
  • Non-productive input: 2,024 hours (10%)
  • Asset utilization: 18,216 / 20,240 = 90%
  • Note: the 10% can be used for additional training needs
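A small sketch of the asset-utilization arithmetic above (11 FTEs at 1,840 available hours each, with an estimated 10% standby share):

```python
# Sketch of the asset-utilization estimate on this slide.
ftes = 11
hours_per_fte = 1_840                     # available productive hours per FTE, per the slide

max_input = ftes * hours_per_fte          # 20,240 hours
non_productive = max_input * 0.10         # ~10% standby, estimated through observation
productive = max_input - non_productive   # 18,216 hours

utilization = productive / max_input
print(f"max input {max_input:,} h, productive {productive:,.0f} h, utilization {utilization:.0%}")
```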

50

Analysis of Readiness Conclusions: What Is Needed
• The right mix of skills and abilities
  • Interpersonal skills, communication abilities (verbal and written), decision-making abilities, technical know-how, analytical skills, the ability to multi-task, and physical ability
• In the next three years ECC is expected to expand its digital CCTV coverage off campus
• There is a need for operational training (e.g., weapons of mass destruction, handling bomb threats)
• The right tools needed to carry out the mission are technological updates (e.g., MAAARS-View, which shows the location of an incoming 911 call; DIAPHONE, which records the incoming call; ANDOVER, which provides updates for building access; CCTV)
• At this point, budget concerns come into play

51

Readiness Continued

• What are the anticipated implications of not obtaining the right mix?

• Poor service
• Inefficiencies
• Liability issues
• Possible loss of life and property

52

Conclusions and Recommendations

53

Conclusions from FY02 ASA

The ECC process is working, but with a squeaky wheel – ECC needs grease because of the following:
• Need for increased pay and benefits to attract qualified personnel
• Demand for service is increasing
• Need for additional personnel
• Need for formal training of personnel
• Need for technological updates
• A substantial safety risk factor exists because of insufficient trained personnel
• Insufficient staffing equals insufficient service
• As an integral part of the Public Safety Branch, without ECC, security and safety are at risk

54

Recommendations
• Increase pay and benefits to attract qualified personnel
  • 10% retention pay
  • Update job descriptions
• Invest in Computer Aided Dispatch (CAD) and other ECC technical enhancements and upgrades
• Hire additional personnel to:
  • Meet increasing demands
  • Reduce overtime and risk
  • Meet the Congressional directive (FY2000) from the USATREX survey suggestion to increase staffing levels to 16 FTEs
  • Offset "abnormal" use of OT for normal operations
• Train ECC personnel to meet ECC industry standards and certifications (e.g., Maryland State)
• Liaise with the Fire Department for a better understanding of their needs
• Create a website to inform the NIH community of our services

55

We're Here to Serve You

NIH 9-1-1, what is your emergency?

9 – The number of days we need in our workweek
1 – The number of times we have to get it right
1 – NIH Emergency Communication Center

THE ONE to call

56

Appendices

57

Appendices

• Include the following:

• Page 2 of ASA Template
• Customer segments graphs
• Customer satisfaction graphs
• Block diagram
• Process maps
• Process measure graphs
• Learning and Growth graphs
• Analysis of Readiness Information

58

Improve "Interpersonal" relationships with ECC clients

Contacts/Complaints with EEO/ER/ADR

Better understand the nature of emergency calls on campus Number and type of callsNumber and type of NCIC requests

Type of

Ratio: staff to volume of calls/contacts

Increase Internal customer satisfaction - Police/Fire/Rescue

Sick Leave Usage, Awards/Recognition

Analysis of Readiness Index

Performance Objective

Appropriately staff the ECC Number of full-time staff positions filled, Number of FY01 authorized positions filled

Increase 50% of the ECS to MD State Certification.

Performance Measure

Customer PerspectivePerformance Measure

Customer satisfaction ratings from the ORS Customer Scorecard for each Discrete Service

Change in satisfaction ratings (FY01 to FY02) collected via survey, Numbear of suggestions from focus groups in FY01 implemented, Number of suggestions for improvements gathered in focus groups in FY02

Customer segmentation of Discrete Services

Identify ECC industry standards

Performance MeasureInternal Business Process Perspective

Complete process maps of Service Group/Discrete Services

Identify and report on process measures for Discrete Services

Maintain & enhance competencies for the future organization.

Performance Objective

Increase understanding of customer base

Increase customer satisfaction

Set Benchmarks to meet ECC industry Standards

Performance Objective

Service Group:

Sick Leave Usage

Increase understanding of processes.

Enhance quality of work life for employees in ORS.

Identify methods to measure processes.

Learning and Growth Perspective

59

Survey Respondents: FY02 Respondents by IC

[Bar chart of the number of respondents by NIH IC. Data based on 17 respondents.]

60

Radar Chart: FY02 Product/Service Satisfaction Ratings

[Radar charts of Cost, Quality, Timeliness, and Reliability ratings on a 1–10 scale.]
• ORS Index = 8.27: Cost 7.42; Quality, Timeliness, and Reliability between 8.28 and 8.48 (data based on 436 respondents)
• Service Group Index = 6.53: Cost 7.33 (highest); Reliability 6.38 (lowest); Quality and Timeliness 6.44 and 6.63 (data based on 17 respondents)

Note: The rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.

61

Radar Chart: FY02 Customer Service Satisfaction Ratings

Note: The rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding. Refer to the Data Analysis and Graphing training for advice on interpreting these results.

[Radar charts of Availability, Responsiveness, Convenience, Competence, and Handling of Problems ratings on a 1–10 scale.]
• ORS Index = 8.55: all five category ratings between 8.51 and 8.60 (data based on 436 respondents)
• Service Group Index = 6.81: category ratings between 5.82 (Competence, the lowest) and 7.59 (data based on 17 respondents)

62

Scatter Diagram: FY02 Customer Importance and Satisfaction Ratings

Note: The Importance rating scale ranges from 1 - 10 where “1” represents Unimportant and “10” represents Important. The Satisfaction rating scale ranges from 1 - 10 where “1” represents Unsatisfactory and “10” represents Outstanding.

[Scatter plot of satisfaction (x-axis, 1–10) against importance (y-axis, 1–10), divided into four quadrants: satisfied/important, satisfied/not important, not satisfied/important, and not satisfied/not important. Data based on 17 respondents.]

63

Scatter Diagram: FY02 Customer Importance and Satisfaction Ratings – A Closer Look

Note: A smaller portion of the chart is shown so that the individual data points can be labeled.

[Enlarged view of the upper portion of the scatter plot with the individual category points labeled: Cost, Convenience, Responsiveness, Availability, Handling of Problems, Competence, Quality, Timeliness, and Reliability, all in the satisfied/important quadrant. Data based on 17 respondents.]

64

Organization chart:
• Chief of Police
  • Deputy Chief of Police
    • Commander, Support Service
    • Commander, Community Oriented Policing
    • Crime Prevention (7 Crime Prevention Specialists)
    • Emergency Communication Center Supervisor
      • Lead, 1st Relief (2 Dispatchers)
      • Lead, 2nd Relief (3 Dispatchers)
      • Lead, 3rd Relief (3 Dispatchers)

65

National Institutes of Health Emergency Communication Center (ECC) Basic Block Chart

1. Request for service
2. The Emergency Communication Center evaluates the request and directs the proper response
3. The requester reacts to the information (e.g., Police/Fire Dept response to an alarm call)

66

DS 1: Operate Emergency Communications Center (deployment flowchart)

Swim lanes: Customer, ECC, Police/Fire Dept. Key steps and decision points:
• The customer initiates a request.
• ECC obtains relevant information from the customer and evaluates the nature of the request (Is it an emergency? Can the request be processed in ECC? Dispatch a unit?).
• ECC disseminates information or transfers the call (Is the requestor satisfied with the information?).
• When a unit is needed, ECC dispatches the appropriate units (Fire or Police) and maintains contact with Police/Fire units while monitoring activities.
• Police/Fire make contact, check the conditions and welfare of people, and rectify the situation (Is the situation stable/satisfied?); the radio call is then cleared.
• ECC documents the action in the log and is prepared for the next request.

67

DS 2: Monitor Closed-Circuit TVs (deployment flowchart)

Swim lanes: ECC, Police/Fire Dept. Key steps and decision points:
• ECC monitors the CCTV system and observes an abnormal event.
• Are dispatching units required? If not, ECC notifies the proper department of the abnormality.
• If units are required, ECC dispatches the appropriate units (Fire or Police) and maintains contact with Police/Fire units while monitoring activities.
• Police/Fire make contact, check the conditions and welfare of people, and rectify the situation (Is the situation stable/satisfied?); the radio call is then cleared.
• ECC continues monitoring the CCTV system.

68

DS 3: Manage National Crime Information Center (NCIC) (deployment flowchart)

Swim lanes: Customer, ECC, FBI. Key steps and decision points:
• The customer requests information.
• ECC checks whether the request is valid; invalid requests are rejected.
• For valid requests, ECC enters the data into the NCIC computer.
• NCIC (FBI) processes the request and returns the information.
• ECC records and logs the transaction and disseminates the information to the customer.

69

70

Average Time To Dispatch A Call in Minutes Control Chart

• Within limits: No major signals

71

911 Control Chart

72

Analysis of Readiness Conclusions: What Is Needed
• The right mix of skills and abilities
  • Interpersonal skills, communication abilities (verbal and written), decision-making abilities, technical know-how, analytical skills, the ability to multi-task, and physical ability
• In the next three years ECC is expected to expand its digital CCTV coverage off campus
• There is a need for operational training (e.g., weapons of mass destruction, handling bomb threats)
• The right tools needed to carry out the mission are technological updates (e.g., MAAARS-View, which shows the location of an incoming 911 call; DIAPHONE, which records the incoming call; ANDOVER, which provides updates for building access; CCTV)
• At this point, budget concerns come into play

73

Readiness Continued

• What are the anticipated implications of not obtaining the right mix?

• Poor service
• Inefficiencies
• Liability issues
• Possible loss of life and property

74

Methodology
• ASA Teams determined the best methodology to assess customer satisfaction
• FY02 methodology reviewed by OQM:
• Customer segments to be assessed

• Customization of ORS Customer Scorecard instrument

• Description of item to be assessed (e.g., Service Group, Discrete Service, specific product/service)

• Method of survey distribution (e.g., email, hard copy)

• Accompanying Memos/email messages

• Timeline for distribution and return

• Number of surveys to be distributed

• Upon gaining approval, ASA Teams distributed surveys to customers

75

Methodology (cont.)

• Completed surveys were returned to OQM or to ASA Consultant (SAIC)

• Preserve customers’ anonymity

• Ensure the integrity of the results

• Survey data were entered into a database and analyzed

• Results typically summarized at Service Group level

• If a sufficient number of completed surveys were returned, analyses could be generated for specific products/services

76

FY02 Learning and Growth (L&G) Data for the Annual Self Assessments

Service Group 13:

Operate Emergency Communications Center

26 September 2002

Summary Prepared by the Office of Quality Management

77

Methodology

• All data represent occurrences from Oct 2001 - June 2002

• Data analyzed covered period between October 1st and end of June to provide time to analyze and present the data

• ORS Human Resources (HR) provided data on:
  • Turnover
  • Sick leave
  • Awards
• HR data are stored in NIH databases by Standard Administrative Codes (SACs)
• Developed a cross-reference of ORS Service Groups to SACs (see the sketch below)
  • Almost all SACs are assigned to Service Groups
  • Some Service Groups have identical SACs; in this case, two Service Groups will receive the same set of data
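A minimal sketch of the Service Group-to-SAC cross-reference just described; the SAC codes and the extra group numbers shown are hypothetical placeholders, not the actual ORS assignments.

```python
# Sketch of the Service Group -> SAC cross-reference described above.
# The SAC codes below are hypothetical placeholders, not actual ORS assignments.
from collections import defaultdict

service_group_to_sacs = {
    13: {"SAC-A"},             # e.g., Operate Emergency Communications Center (illustrative SAC)
    14: {"SAC-A"},             # a group sharing the same SAC receives the same HR data
    21: {"SAC-B", "SAC-C"},
}

# Invert the mapping so HR records keyed by SAC can be routed to Service Groups.
sac_to_groups = defaultdict(set)
for group, sacs in service_group_to_sacs.items():
    for sac in sacs:
        sac_to_groups[sac].add(group)

print(dict(sac_to_groups))  # groups 13 and 14 both receive the data recorded under SAC-A
```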

78

Methodology (cont.)

• Also obtained data from:
  • Equal Employment Opportunity (EEO): number of EEO complaints
  • Employee Relations (ER): number of ER cases
  • Alternative Dispute Resolution (ADR): number of ADR cases

79

Interpreting Your Data

• FY02 is the first time L&G data were collected and analyzed• Compare your Service Group relative to the other ORS Service

Groups • What are all the L&G indicators telling you?• In the future your group should compare itself to its own Service

Group data over time

• Interpret data in terms of other ASA data• Customer satisfaction ratings• Process measures• Financial measures

• Does the L&G data, when compared to data in other perspectives, show potential relationship (could L&G be contributing to customer satisfaction results)?

• From reviewing your Service Group’s L&G data, what could be done to improve Quality of Work Life (QOWL)?

80

Service Group Turnover Rate

• Calculated as the number of separations for a Service Group / Population of Service Group

• Separations are defined as:
  • Retirements (separation codes 3010, 3020, 3022)
  • Resignations (separation codes 3120, 3170)
  • Removals (separation code 3300)
  • Terminations (separation codes 3520, 3550, 3570)
  • Promotions to a new organization (separation code 7020)
  • Reassignments (separation code 7210)

• Note that transfers/promotions within ORS Divisions/Offices are not captured by the NIH database

81

Service Group Turnover Rate (cont.)

• Calculation of Service Group population was needed since number of employees changes over time • Population for Service Group was estimated

based on average of employee count at three snapshots in time (Nov 2001, Feb 2002, June 2002)
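A minimal sketch of the turnover-rate calculation described on the last two slides; the personnel-action record format shown is a hypothetical illustration.

```python
# Sketch of the Service Group turnover-rate calculation described above.
# The (employee_id, action_code) record format is a hypothetical illustration.
SEPARATION_CODES = {
    "3010", "3020", "3022",   # retirements
    "3120", "3170",           # resignations
    "3300",                   # removals
    "3520", "3550", "3570",   # terminations
    "7020",                   # promotions to a new organization
    "7210",                   # reassignments
}

def turnover_rate(actions, headcount_snapshots):
    """Separations (identified by code) divided by the population averaged over the snapshots."""
    separations = sum(1 for _, code in actions if code in SEPARATION_CODES)
    population = sum(headcount_snapshots) / len(headcount_snapshots)
    return separations / population

# One separation against an average population of 7 gives roughly the 0.14 reported for ECC.
print(round(turnover_rate([("E01", "3120")], [7, 7, 7]), 2))  # -> 0.14
```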

82

Average Hours of Sick Leave Used

• Calculated as the total number of sick leave hours used for a Service Group / Population of Service Group

83

Average Number of Awards Received

• Calculated as the total number of awards received / Population of Service Group

• Includes both monetary and non-monetary awards:
  • Cash awards
  • QSIs
  • Time-off
  • Honorary
  • Customer Service

84

Average Number of EEO Complaints

• Calculated as the total number of EEO complaints for a Service Group / population of the Service Group

85

Average Number of ER Cases

• Calculated as the total number of ER cases for a Service Group / population of the Service Group
• A case is defined as any contact with the ER Office where an action occurs (e.g., a letter is prepared)

86

Average Number of ADR Cases

• Calculated as the number of ADR cases for a Service Group / population of the Service Group
• A case is initiated when a person contacts ADR (the per-capita sketch below covers this and the other average measures)
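Each of the L&G measures above is a simple per-capita ratio over the Service Group population estimate. A minimal sketch using ECC's counts from the data table follows; the published averages (e.g., 19 sick leave hours, 0.55 awards) appear to use an unrounded population estimate, so the ratios below differ slightly.

```python
# Sketch: the L&G measures above are per-capita ratios over the Service Group population.
# Counts come from the Service Group 13 data table; 7 is the rounded population estimate,
# so the results differ slightly from the published averages.
population = 7

counts = {
    "sick leave hours": 140,
    "awards received": 4,
    "EEO complaints": 0,
    "ER cases": 1,
    "ADR cases": 0,
}

for name, count in counts.items():
    print(f"average {name} per employee: {count / population:.2f}")
```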

87

Learning and Growth Data Table

About 2 days of sick leave per employee; about 1 award for every 2 employees; 14% employee turnover; 1 ER case out of 7 employees.

Service Group 13 totals:
• Population estimate: 7
• Number of separations: 1; turnover rate: 0.14
• Total hours of sick leave used: 140; average hours of sick leave used: 19
• Number of awards received: 4; average number of awards received: 0.55
• Number of EEO complaints: 0; average: 0.00
• Number of ER cases: 1; average: 0.14
• Number of ADR cases: 0; average: 0.00

88

Service Group Turnover Rate (Oct 2001 – June 2002)
[Bar chart of turnover rate by Service Group number.]

89

Average Hours of Sick Leave Used (Oct 2001 – June 2002)
[Bar chart of average sick leave hours by Service Group number.]

90

Average Number of Awards Received (Oct 2001 – June 2002)
[Bar chart of average number of awards by Service Group number.]

91

Average Number of EEO Complaints (Oct 2001 – June 2002)
[Bar chart of average number of EEO complaints by Service Group number.]

92

Average Number of ER Cases (Oct 2001 – June 2002)
[Bar chart of average number of ER cases by Service Group number.]

93

Average Number of ADR Cases (Oct 2001 – June 2002)
[Bar chart of average number of ADR cases by Service Group number.]