Emergency Management as a Career

I was recently asked if I would post some information assembled for the Arizona State University (ASU) online emergency management program for those who might want to know more about emergency management careers.  The infographic they sent provides a great snapshot of emergency management career opportunities.

From their outreach office: “When different agencies work together to manage a disaster, someone has to coordinate their activities to ensure things run smoothly.  This is where emergency management specialists are crucial.  From helping reduce vulnerability to hazards to helping communities manage disasters effectively.  Below is a look at emergency management as a career.” Information on their program can be found at https://asuonline.asu.edu/.



Studying Strange Disasters – The Boston Molasses Flood


Aftermath of the Boston Molasses Flood.  Source: Boston Public Library

Anyone who has been in the trade of emergency management will likely tell you to always expect the unexpected.  No two disasters are ever the same.  While we can predict similarities from one flood, fire, or hurricane to another, there are always different impacts, needs, and circumstances, which often give us cause to consider different means and methods in our response.  Some disasters are noted for distinctive impacts, needs, or circumstances that become a theme of sorts for that disaster.  Every once in a while, however, a disaster occurs that was largely unexpected.

Ninety-nine years ago this month, the City of Boston encountered one of those unexpected disasters when a flood of over two million gallons of molasses rushed through several blocks of Boston’s North End, killing 21 people and several horses, injuring 150 people, and destroying numerous buildings.  The molasses took weeks to clean up, and the cause and origin investigation took years, ultimately ending with the company that owned the massive storage tank being found liable.

While I had originally intended to write more about the incident in-depth, I think it most prudent to steer readers toward some of the sources I had looked at, as the information is quite interesting.

While an incident like this seems so unlikely as to never occur again, never say never.  In 2013, over 200,000 gallons of molasses spilled into Honolulu Harbor.  While no people were killed or buildings destroyed by this pipe leak, the fish kill in the harbor was massive.

And yes, even a beer spill can be hazardous.  In 1814, several tanks containing over 300,000 gallons of beer ruptured in London.  The tidal wave of ale damaged or destroyed several structures and killed eight people, aged 3 to 63.

We often think of hazardous materials as only being volatile chemicals that can ignite or give off harmful, noxious fumes.  We must consider that any substance, in sufficient quantity, introduced into a space where it’s not supposed to be can be extremely hazardous, both to people and the environment.  A flood is the most fundamental of these… I don’t think we need to detail the threat and impacts of flood waters.  But as you assess hazards in your community, consider that bulk storage of things like milk, grains, or other materials we often don’t consider hazardous can cause great impact should they be unleashed on people, infrastructure, and the environment.  While our safety regulations (a mitigation measure) are certainly stronger than those in place in the 1800s and early 1900s, the hazards still exist.  Be smart and don’t dismiss those hazards outright.

What out-of-the-ordinary hazards concern you?

© 2018 – Timothy M. Riecker, CEDP

Emergency Preparedness Solutions, LLC SM

NIMS Implementation Objectives and a Shot of Reality

Happy 2018 to all my readers!  Thanks for your patience while I took an extended holiday break.  A minor surgery and the flu had sidelined me for a bit, but I’m happy to be back.

This morning, FEMA issued NIMS Alert 01-18: National Engagement for Draft NIMS Implementation Objectives.  NIMS Implementation Objectives were last released in 2009, covering FY 2009 through FY 2017.  With the release of the updated NIMS last year, FEMA is updating the implementation objectives and has established a national engagement period for their review.

So first, a bit of commentary on this document…

The new objectives are broken out by the major content areas of the updated NIMS document: Resource Management, Command and Coordination, and Communications and Information Management, as well as a General category covering issues related to the management and administration of the NIMS program.  What we also see with these updated objectives are implementation indicators, which are intended to help ground each objective.  Overall, the number of objectives in this update has been cut in half from the 2009 version (from 28 objectives to 14).

All in all, these objectives appear to be consistent with the current state of NIMS implementation across the nation.  They are certainly suitable for most matters in regard to the oversight of implementing NIMS and its various components.  The biggest sticking point for me is that this document is intended for use by states, tribal governments, and territories.  If the goal is a cohesive national approach to implementation, I’d like to know what the implementation objectives are for FEMA/DHS and how they complement those included in this document.

Objectives 8 through 11 are really the crux of this document.  They are intended to examine the application of NIMS in an incident.  These objectives and their corresponding indicators (which are largely shared among these objectives) are the measure by which success will ultimately be determined.  While it’s a good start for these to exist, jurisdictions must be more open to criticism of their implementations of NIMS and ICS.  In addition, there should be an improved mechanism for assessing the application of NIMS and ICS.  While formal evaluations occur for exercises under the HSEEP model, we tend to see inconsistent application of the feedback and improvement activities needed to correct deficiencies.  Proper evaluations of incidents, especially at the local level, are often not performed, or not performed well.  For those that are, the same issue of feedback and improvement often stands.

Extending this discussion into reality…

The reality is that many responders are still getting it wrong.  Last year my company conducted and evaluated dozens of exercises.  Rarely did we see consistently good performance as far as NIMS and ICS are concerned.  There are several links in this chain that have to hold firm.  Here’s how I view it:

First, the right people need to be identified for key roles.  Not everyone is suited for a job in public safety or emergency management in the broadest sense.  Organizations should not set individuals, and themselves, up for failure by putting the wrong person in a job.  If a certain job is expected to have an emergency response role, there are additional qualifications and expectations that must be met.  Moreover, if someone is expected to take on a leadership role in an ICS-modeled organization during an incident, the expectations are higher still.

Next, quality training is needed.  I wrote a couple years ago about how ICS Training Sucks.  It still does.  Nothing has changed.  We can’t expect people to perform if they have been poorly trained.  That training extends from the classroom into implementation, so we can’t expect someone to perform to standards immediately following a training course.  There is simply too much going on during a disaster for a newbie to process.  People need to be mentored.  Yes, there is a formal system for Qualification and Certification in ICS, but this is for proper incident management teams, something most local jurisdictions aren’t able to put together.

Related to this last point, I think we need a new brand of exercise: one that is more instructional, where trainees are mentored and given immediate, relevant feedback instead of having to wait for an AAR that likely won’t provide feedback at the individual level anyway.  The exercise methodology we usually see applied calls for players to do their thing: right, wrong, or otherwise; then read about it weeks later in an AAR.  There isn’t much learning that takes place.  In fact, when players are allowed to do something incorrectly and aren’t corrected on the spot, the error is effectively reinforced, not just for that individual, but also for others, especially given how interrelated the roles and responsibilities within an ICS organization are.

While I’m all for allowing performers to discover their own mistakes, and I certainly recognize that there are multiple ways to skin the proverbial cat (no animals were harmed in the writing of this blog), this kind of discovery learning works best at a higher level of the learning taxonomy.  Many people I see implementing portions of ICS simply aren’t there yet.  They don’t have the experience to help them recognize when something is wrong.

As I’ve said before, this isn’t a school yard game of kickball.  Lives are at stake.  We can do better.  We MUST do better.

As always, thoughts are certainly appreciated.

© 2018 – Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC SM



Project Responder and DHS’ Inability to Follow Standards

I was recently made aware of Project Responder, a publication sponsored by the DHS Science and Technology Directorate, which examines emergency response capability needs within the scope of current operational requirements, threats, and hazards, with an ultimate focus on identifying needs and correlating them with technological fixes.  The project description states that ‘the findings from the project can inform the US Department of Homeland Security’s decisions about investments in projects and programs to promote capability enhancement…’.  Project Responder 5 was published in August of this year.  Prior to this edition, I had not been familiar with the project, which started in early 2001.

The executive summary of the document states that ‘the document describes 37 capability needs identified by emergency responders…’ <record scratch>.  Hold on a moment… I thought DHS defined 32 Core Capabilities.  Yep, they still do.  The first page of Project Responder 5 includes a footnote stating ‘For purposes of this document, a capability is defined as “the means to accomplish one or more tasks under specific conditions”’.  In other words, DHS can’t follow its own standards.  In many of my articles I’ve remarked on the continual need to streamline our emergency management processes so we can make easier comparisons between processes, efforts, and activities without having to establish crosswalks or translations.  By working from the same standards, we can move easily between mission areas, which don’t always have boldly marked lines between them, and have the ability to define target results and measure progress.  The Core Capabilities established by the National Preparedness Goal go a long way toward accomplishing this standardization.  It seems the folks in the Science and Technology Directorate don’t think they are that important, and this infuriates me.

The document outlines the 37 capability needs within nine capability domains.  These are:

  • Risk Assessment and Planning
  • Communication and Information Sharing
  • Command, Control, and Coordination
  • Training and Exercise
  • Responder Health and Safety
  • Intelligence and Investigation
  • Logistics and Resource Management
  • Casualty Management
  • Situational Awareness

Some of these appear to correlate directly with some of the 32 Core Capabilities we know, while others seem to combine, redefine, or create new ones.  As the gaps within each domain are discussed, applicable standards are referenced.  Interestingly enough, certain standards you would expect to see aren’t present, such as NIMS in the Command, Control, and Coordination domain and HSEEP in the Training and Exercise domain.  Regardless of what technology applications are used to support these areas, these standards are fundamental.

It’s not that the data and analysis that come out of Project Responder are entirely bad.  They aren’t.  But they’re not great either.  The document seems to fall short consistently throughout.  The information also needs to be organized within the current lexicon, allowing readers to make direct correlations to what we are familiar with.  I’m guessing that the project team that did the research and pulled the document together actually knows very little about emergency management or homeland security.  Their inability to communicate context and work within established standards seems to demonstrate this.  It’s fine that the document focuses on technology implementations that can address gaps, but the fundamentals of the field of practice can’t be ignored.  I don’t see why this project could not have been conducted within established industry standards.

Perhaps I’ve given a more soap-boxish post than I usually do.  I’m frustrated to see so much wasted time, effort, and money on something that could have been far more impactful.  Please take a look through the document and let me know what your impressions are.  Also, if you happen to have any insight on this publication that I have missed or am not aware of, I’d love to hear it.

Thanks for reading and be safe this holiday season.

© 2017 – Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC

2017 National Preparedness Report – A Review

With my travel schedule, I missed the (late) release of the 2017 National Preparedness Report (NPR) in mid-October.  Foundationally, the findings of the 2017 report show little change from the 2016 report.  If you are interested in comparing, you can find my review of the 2016 NPR here.

The 2017 NPR, on the positive side, provided more data, and more meaningful data, than its predecessor.  It appeared to me that more time and effort were spent analyzing this data.  If you aren’t familiar with the premise of the NPR, the report is a compilation of data obtained from State Preparedness Reports (SPRs) submitted by states, territories, and UASI-funded regions; so the NPR, fundamentally, should be a reflection of what was submitted by these jurisdictions and regions, for better or worse.  The SPR asks jurisdictions to provide an honest analysis of each of the core capabilities across the POETE capability elements (Planning, Organization, Equipment, Training, and Exercises).

From the perspective of the jurisdictions, no one wants to look bad.  That’s not to say any jurisdiction has lied, but agendas can certainly sway subjective assessments.  Jurisdictions want to show that grant money is being spent effectively (in the hopes of obtaining more), but not with such terrific results that anyone would think they don’t need more.  Over the past few years, I believe the SPRs have started to normalize and better reflect reality.  I think the authors of the NPR have also come to look at the data they receive a little more carefully and word the NPR to reflect this reality.

The 2017 NPR (which evaluates 2016 data from jurisdictions) identified five core capabilities the nation needs to sustain.  These are:

  • Environmental Response/Health and Safety
  • Intelligence and Information Sharing
  • Operational Communications
  • Operational Coordination
  • Planning

I’m reasonably comfortable with the first two, although they both deal with hazards and details that change regularly, so keeping on top of them is critical.  It’s interesting that Operational Communications is rated so highly, yet is so commonly cited as a top area for improvement in after-action reports of exercises, events, and incidents.  To me, the evidence doesn’t support the conclusion for this core capability.  Operational Coordination and Planning both give me significant concern.

First, in regard to Operational Coordination, I continue to have a great deal of concern about the ability of responders (in the broadest definition) to effectively implement the Incident Command System (ICS).  While implementing ICS doesn’t comprise all of this core capability, it is certainly a great deal of it.  I think there is more room for improvement than the NPR would indicate.  For example, in a recent exercise I supported, the local emergency manager determined there would be a unified command, with him holding ‘overall command’.  Unfortunately, such misinterpretations of ICS are endemic.

I believe the Planning core capability is in a similar state of inadequacy.  Preparedness rests, fundamentally, on proper planning and the assessments that support it.  Just as I’ve pontificated at length about the inadequacy of ICS training, I’ve seen far too many plans with gaps you could drive a truck through.  I recently exercised a college emergency response plan that provided no details or guidance on critical tasks, such as evacuating a dormitory and supporting the evacuated students.  The plan did a great job of identifying who should be in the EOC, but gave no information on what they should be doing or how they should do it.  The lack of plans that can be operationalized and implemented is staggering.

The NPR identified the top core capabilities to be improved.  There are no surprises in this list:

  • Cybersecurity
  • Economic Recovery
  • Housing
  • Infrastructure Systems
  • Natural and Cultural Resources
  • Supply Chain Integrity and Security

Fortunately, I’m seeing some (but not all) of these core capabilities getting some needed attention, but clearly not enough.  These don’t have simple solutions, so they will take some time.

Page 10 of the NPR provides a graph showing the distribution of FEMA preparedness (non-disaster) grants by core capability for fiscal year 2015.  Planning (approx. $350 million) and Operational Coordination (approx. $280 million) lead the pack by far.  I’m curious what specific activities these dollars are actually being spent on, because my experience shows that it’s not working as well as is being reported.  Certainly there has been some positive direction, but I’m guessing that dollars are being spent on activities that either have negligible impact or an actively negative one, such as funding the development of some of the bad plans we’re seeing out there.

I’m curious as to what readers are seeing out in real life.  Which capabilities concern you the most?  In which capabilities do you see successes?  Overall, I think everyone agrees that we can do better.  We can also produce better and more meaningful reports.  This NPR was a step in the right direction from last year’s, but we need to continue that forward progress.

© 2017 – Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC

Learning From a 10-Year-Old Report

I’ll admit that I’m often dismissive of information, especially in the field of emergency management and homeland security, if it’s over 10 years old.  There is a lot that’s changed in the past 10 years, after all.  But, realistically, for as much as we’ve changed, things have stayed the same.  Arguably, the first decade of this millennium saw much more change in EM/HS than the second decade has, at least so far.  The first decade saw events like 9/11 and Hurricane Katrina.  Yes, there certainly have been major events in this second decade, but none, it seems, were as influential to our field of practice as those in the first decade.

It’s important to reflect upon lessons observed and to examine what lessons we’ve actually learned.  How far have we come in implementing improvements from the 9/11 Report?  What still needs to be accomplished to meet the intent of the Post-Katrina Emergency Management Reform Act (PKEMRA)?  Perhaps when I have some time to devote, I’ll review those documents again, reflect on them, and provide my thoughts here.

Yesterday I received the latest email from DomesticPreparedness.com.  I refer often to their work in my articles.  This weekly brief included an article from one of my favorite authors in this field, John Morton, whose work I’ve referenced in a few of my past articles.  The article, titled The What If Possibility: A Chilling Report, talks about planning for a rogue nuclear attack, the likely lead role the federal government would have to take in responding to such an attack (versus a locally led response), and what the situation would be the day after.  With the threat of North Korean nuclear weapons capability looming, this article was an interesting read and spot-on.  I noticed a problem, though… it referenced Ash Carter as an assistant secretary of defense in the Clinton administration.  While this was true, Carter’s highest office was Secretary of Defense under President Obama.  Surely John Morton, with the incredible attention to detail I’ve come to recognize, couldn’t have made this error.

Nope.  No error on his part.  I looked at the date of the article: June 27, 2007, over a decade old.  Incredibly, the article is still highly relevant today.  It references the drafting of certain federal plans for nuclear attack, plans I am not privy to, but which assuredly exist today.  I’m curious as to the model these plans follow, what has been learned from exercising them, and how we might apply elements of them to other catastrophic occurrences.

Despite change, so much seems to stay the same.  Of course, a decade isn’t that long.  Given that emergency management and homeland security are primarily government roles, we have to acknowledge that the (usually necessary) bureaucracy simply doesn’t move that quickly.  Unfortunately, there are things we are far too slow to adopt, not just from a government perspective, but socially.  As a lover of history and sociology, I see lessons observed from the 1900 Galveston hurricane as well as the eruption of Mt. Vesuvius in 79 CE.  There is much that history can teach us, if we are willing to listen.  Lessons observed, but not learned.

© 2017 – Timothy Riecker

Emergency Preparedness Solutions, LLC

The NIMS Refresh, aka NIMS 3.0

This morning my inbox was inundated with notices from FEMA and from colleagues about the release of the ‘refreshed’ NIMS, which has finally occurred almost exactly 18 months after the draft of the document was released.  You can find the new document here.

As I’m reading through the updated document, there are a few things catching my eye:

  • The term ‘center management system’ has apparently been scrapped, thankfully. First of all, there should not be a separate system for managing emergency operations centers (EOCs) and similar facilities.  I’ve seen the greatest success come from an organizational model that mirrors ICS.  Second, the acronym CMS is most commonly associated with the Centers for Medicare and Medicaid Services, particularly in regard to the CMS rules for healthcare facility preparedness.  (Want to know more about this?  See my article here)
  • Multi-agency coordination as a concept is briefly defined and referenced often without being described in enough depth. It’s such an essential concept of incident management, yet it’s paid very little heed. There is material on the MAC Group which, while an implementer of multi-agency coordination at the policy level, is not the only multi-agency coordination that takes place within incident management.
  • The final version still uses the term ‘EOC Director’.  This term is fundamentally incorrect when held to ICS doctrine.  Those in charge of facilities in ICS are called managers, and an EOC, even a virtual one, functions as a facility.  Similarly, the EOC analogs to the command staff should be referred to as ‘management staff’, not command staff.
  • In the draft there were nearly two pages of references to federal EOC-like facilities.  That material was unnecessary and irrelevant to the document.  Thankfully, those references and descriptions have been removed.
  • One of my favorite graphics continues to be used! Figure 10 on page 48 is, to me, one of the most meaningful graphics in all of emergency management.  It pays heed to all critical elements in a response and shows the flow of requests and assistance.
  • I’m a big fan of the Essential Elements of Information (EEI) concept included in the Incident Information section of the document. This should serve as a foundation for all situation assessment and size-up documents in all public safety disciplines moving forward.
  • The appendices offer some additional information, but are largely duplicative of the core document.

Overall, NIMS 3.0 is a good document to move forward with.  While there are some elements I don’t necessarily agree with, none of them are damaging to our field of practice.  While NIMS remains our core doctrine for response, what is missing from this document, and what we saw heavily included in earlier versions, is the concept of integrating NIMS into other aspects of emergency management.  Above all, NIMS is something that must be prepared for.  It simply isn’t enough to include a one-liner in your emergency plans saying that you are using NIMS.  The elements of NIMS, not just ICS, but things like EOC management, multi-agency coordination, resource management, and joint information management, need to be fully ingrained in your plans.  Plans serve as the foundation for preparedness, so what is in our plans must be trained on and exercised in a continuous cycle.  I would have liked to see a very apparent reference to the National Preparedness Goal in this document.  Otherwise, it appears to many that these doctrines are unrelated.

Now that the center management system is gone and the authors were less heavy-handed with EOC management concepts, I wonder what that means for training related to EOC management.  The current FEMA curriculum on EOC management is simply horrible (which is why I’ve created EOC management courses for various jurisdictions).

What are your thoughts on the NIMS refresh?  What did they do well?  What did they miss? Was it too safe with too few changes?  Were there other changes needed to improve our coordination of incident management?

As always, thanks for reading!

© 2017 – Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC