Failed Attempts to Measure NIMS Compliance – How can we get it right?

Yesterday the US Government Accountability Office (GAO) released a report titled Federal Emergency Management Agency: Strengthening Regional Coordination Could Enhance Preparedness Efforts.  I've been awaiting the release of this report for a while, as I am proud to have been interviewed for it as a subject matter expert.  It's the second GAO report on emergency management I've been involved in during my career.

The end result of this report is an emphasis on a stronger role for the FEMA regional offices.  The GAO came to this conclusion through two primary discussions: one on grants management, the other on assessing NIMS implementation efforts.  The discussion of how NIMS implementation has historically been measured shows the failures of that system.

When the National Incident Management System (NIMS) was first created as a nationwide standard in the US via President Bush's Homeland Security Presidential Directive (HSPD) 5 in 2003, the NIMS Integration Center (NIC) was established to make this happen.  This was a daunting, but not impossible, task, involving the development of a standard (luckily, much of this already existed through similar systems), the creation of a training plan and curricula (again, much of this already existed), and encouraging something called 'NIMS implementation' by every level of government and other stakeholders across the nation.  This last part was the really difficult one.

As identified in the GAO report: “HSPD-5 calls for FEMA to (1) establish a mechanism for ensuring ongoing management and maintenance of the NIMS, including regular consultation with other federal departments and agencies and with state and local governments, and (2) develop standards and guidelines for determining whether a state or local entity has adopted NIMS.”

While there was generally no funding directly allocated to NIMS compliance activities for state and local governments, FEMA/DHS made NIMS compliance a required activity for eligibility for many of its grant programs.  (So let's get this straight: if my jurisdiction is struggling to be compliant with NIMS, you will take away the very funds that would help me do so?)  (I heard a few rumors of funds actually being denied, but none were ever confirmed.)

NIMS compliance was (and continues to be) a self-certification, with little to no effort at the federal level to actually assess compliance.  Annually, each jurisdiction would complete an online assessment tool called NIMSCAST (the NIMS Compliance Assistance Support Tool).  NIMSCAST ran until 2013.

NIMSCAST was a mix of survey-type questions: some yes/no, some with qualified answers, and most simply looking for numbers – usually the number of people trained in each of the ICS courses.  From FEMA's NIMS website: "The purpose of the NIMS is to provide a common approach for managing incidents."  How effective do you think the NIMSCAST survey was at gauging progress toward this?  The answer: not very.  People are good at being busy without actually accomplishing anything.  That's not to say that many jurisdictions didn't make good faith efforts to comply with the NIMS requirements (and thus were dedicated to accomplishing better incident management), but many were pressured and intimidated, 'pencil whipping' certain answers for fear of losing federal funding.  Even among those with good faith efforts, churning a bunch of people through training courses does not necessarily mean they will implement the system they are trained in.  Implementation of such a system requires INTEGRATION through all realms of preparedness and response.  While NIMSCAST certainly provided some measurable results, particularly the number of people completing ICS courses, that really doesn't tell us anything about IMPLEMENTATION.  Are jurisdictions actually using NIMS and, if so, how well?  NIMSCAST was as much a show of being busy while accomplishing little as some of the activities it measured.  It's unfortunate that this numbers game lasted almost ten years.

In 2014, the NIC (which now stands for the National Integration Center) incorporated NIMS compliance questions into the Unified Reporting Tool (URT), including about a dozen questions in every state's THIRA and State Preparedness Report submission.  Jurisdictions below the state level (unless they are Urban Area Security Initiative grant recipients) no longer need to provide any type of certification of their NIMS compliance (unless required by the state).  The questions asked in the URT, which simply check for a NIMS pulse, are even less effective at measuring any type of compliance than NIMSCAST was.

While I am certainly being critical of these efforts, I have acknowledged, and continue to acknowledge, how difficult this particular task is.  But there must be a more effective way.  Falling back on my roots in curriculum development, we must identify how we will evaluate learning early in the design process.  The same principle applies here.  If the goal of NIMS is to "provide a common approach to managing incidents", then how do we measure that?  The only acceptable methodology for measuring NIMS compliance is one that actually identifies whether NIMS has been integrated and implemented.  How do we do that?

The GAO report recommends the evaluation of after action reports (AARs) from incidents, events, and exercises as the ideal methodology for assessing NIMS compliance.  It’s a good idea.  Really, it is.  Did I mention that they interviewed me?

AARs (at least those that are well written) provide the kinds of information we are looking for.  Does that information easily correlate into numbers and metrics?  No.  That's one of the biggest challenges with using AARs, which are full of narrative.  Another barrier to consider is how AARs are written.  The HSEEP standard for AARs is to focus on core capabilities.  The issue: there is no NIMS core capability.  The reason is that NIMS/ICS encompasses a number of key activities that we accomplish during an incident.  The GAO identified operational coordination, operational communication, and public information and warning as the three core capabilities with the most association to NIMS activities.
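To illustrate why AAR narratives resist easy conversion into metrics, consider the crudest possible automated approach: counting keyword hits per core capability.  This is purely a hypothetical sketch – the keyword lists below are my own illustrative assumptions, not any official taxonomy – and its limitations make the point: a count tells you a topic was mentioned, not whether the activity went well or poorly.

```python
import re
from collections import Counter

# Hypothetical, illustrative keyword lists keyed to the three core
# capabilities the GAO associated with NIMS activities. These terms are
# assumptions for demonstration, not an official HSEEP/NIMS taxonomy.
CAPABILITY_KEYWORDS = {
    "operational coordination": ["unified command", "incident command", "liaison"],
    "operational communication": ["interoperable", "radio", "communications plan"],
    "public information and warning": ["pio", "joint information", "public alert"],
}

def tag_aar(text: str) -> Counter:
    """Count whole-word keyword hits per core capability in AAR narrative text."""
    lowered = text.lower()
    counts = Counter({capability: 0 for capability in CAPABILITY_KEYWORDS})
    for capability, keywords in CAPABILITY_KEYWORDS.items():
        for kw in keywords:
            pattern = r"\b" + re.escape(kw) + r"\b"
            counts[capability] += len(re.findall(pattern, lowered))
    return counts

# A made-up AAR excerpt for demonstration.
sample = ("The jurisdiction established unified command within the hour, "
          "but the communications plan was not distributed and the PIO "
          "issued no public alert.")
print(tag_aar(sample))
```

Notice that the sample excerpt reports one success and two failures, yet the tagger scores all three capabilities as simply "mentioned."  That gap between mention and performance is exactly why reviewers with public safety or emergency management experience are needed to read the narrative itself.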

The GAO recommends that the assessment of NIMS compliance is best situated with FEMA's regional offices.  This same recommendation comes from John Fass Morton, who authored Next-Generation Homeland Security (follow the link for my review of this book).  Given the depth of analysis required to review AAR narratives, the people doing these assessments absolutely must have some public safety and/or emergency management experience.  To better enable this measurement (which will help states and local jurisdictions, by the way), there may need to be some modification to the core capabilities and to how we write AARs, to help us better draw out some of the specific NIMS-related activities.  This, of course, would require several areas within FEMA/DHS to work together… which is something they are becoming better at, so I have faith.

There is plenty of additional discussion to be had regarding the details of all this, but it's best we not get ahead of ourselves.  Let's see what will actually be done to improve how NIMS implementation is assessed.  And don't forget the crusade to improve ICS training!

What are your thoughts on how best to measure NIMS implementation?  Do you think the evaluation of AARs can assist in this?  At what level do you think this should be done – State, FEMA Regional, or FEMA HQ?

As always, thanks for reading!

© 2016 – Timothy Riecker
