Independent Studies | February 23, 2018

Methodology Review: Validation of the Active Shooter Incident Management Checklist

Executive Summary

"Effective job aids are critical cognitive tools designed to mitigate cognitive tunneling. This report details the multi-layered validation methodology applied to the Active Shooter Incident Management Checklist, ensuring its efficacy and usability for incident commanders and first responders."

Key Takeaways

  • Compliance with Stufflebeam's development guidelines (91.67% compliance score).
  • Adherence to Bichelmeyer's formatting best practices for cognitive ergonomics.
  • Conformity with NASA flight-deck typography standards for high-stress legibility.
  • Validated by over 200 first responders in full-scale functional exercises.

1.0 Introduction: The Imperative for a Rigorously Validated Incident Management Tool

In high-stress, dynamic environments such as active shooter incidents, the operational demands on first responders can induce acute stress, leading to performance degradation. Effective job aids are critical cognitive tools designed to mitigate cognitive tunneling and ensure essential tasks are completed systematically. For such a tool to be reliable, it must undergo a comprehensive, multi-layered validation process. The purpose of this document is to detail the methodology applied to validate the Active Shooter Incident Management Checklist, ensuring its efficacy and usability for incident commanders and first responders.

This review will examine the three distinct pillars upon which the checklist’s validation was built, each corresponding to an established professional or academic standard:

  • Compliance with established academic and professional guidelines for checklist development.
  • Adherence to best practices for formatting and usability.
  • Conformity with specialized standards for typography and legibility under pressure.

These compliance audits are complemented by direct user feedback gathered during full-scale, functional training exercises, providing a complete picture of the checklist’s validity. This dual approach provides a robust framework for assessing the tool’s theoretical soundness and its practical effectiveness in the field.

2.0 A Multi-Faceted Validation Framework: Rationale and Approach

The strategic decision to employ a multi-faceted validation framework is central to establishing the credibility of a life-safety tool. Relying on a single validation method is insufficient for a resource intended for use in high-consequence events. This approach, therefore, triangulates evidence from development theory, human factors design principles, and end-user performance to build a comprehensive case for the checklist’s effectiveness.

2.1 Foundational Integrity: Stufflebeam’s Checklist Development Checklist

The foundational layer of validation was an audit against Daniel L. Stufflebeam’s Guidelines for Developing Evaluation Checklists. This framework was selected to ensure that the checklist’s content and creation process were methodologically sound from the outset. This review confirms that the checklist was constructed through a systematic process involving literature review, expert consultation, and a structured review cycle, ensuring its foundational integrity.

2.2 Usability and Formatting: Bichelmeyer’s Checklist for Formatting Checklists

The second validation layer assessed the checklist’s design for minimizing extraneous cognitive load and supporting cognitive offloading. Using Barbara Bichelmeyer’s Checklist for Formatting Checklists, this audit evaluated how the checklist’s layout, language, and structure facilitate rapid comprehension and correct application during a chaotic incident. By assessing criteria such as active voice, precise terminology, and logical flow, it verifies that the checklist is engineered to reduce user error when stakes are highest.

2.3 Legibility Under Stress: NASA’s Typography of Flight-Deck Documentation

The final layer of the theoretical audit applied standards from NASA’s research, "On the Typography of Flight-Deck Documentation." This choice was justified by the direct parallel between the high-stakes, high-stress environments of aviation flight decks and emergency incident command, where absolute clarity is paramount. This audit examined elements such as font choice, character spacing, and contrast to ensure the checklist remains legible and unambiguous under adverse operational conditions.

Each component of this framework was then measured through an empirical compliance audit, providing objective evidence of the checklist’s adherence to best practices; the quantitative results are presented in the next section.

3.0 Guideline Compliance Audit: A Quantitative Assessment

This section presents the quantitative outcomes of measuring the Active Shooter Incident Management Checklist against the three expert guidelines. The results provide objective evidence of the checklist’s design quality and its adherence to established best practices in development, formatting, and typography.

3.1 Stufflebeam Development Process Compliance (28 January 2014)

The audit against Stufflebeam’s guidelines focused on the procedural rigor of the checklist’s creation, with findings summarized below. The review yielded an overall compliance score of 91.67% “Yes” across 36 applicable criteria, with zero items rated “No,” indicating a robust and systematic development process.

The three ‘Partial’ compliance items represent minor procedural variations, such as using electronic documents instead of physical 4x6 cards for sorting. These deviations are procedural nuances that do not detract from the substantive quality or the integrity of the development methodology.

3.2 Bichelmeyer Formatting Compliance (27 January 2014)

The audit against Bichelmeyer’s formatting guidelines assessed the checklist’s user-centric design. The review resulted in a 91.30% “Yes” compliance rate across 46 criteria, demonstrating a strong focus on cognitive ergonomics.

The noted deviations are not design flaws but rather deliberate trade-offs that prioritize operational realism and clarity, ultimately enhancing the tool’s practical usability.

3.3 NASA Typography Compliance (29 January 2014)

The audit against NASA’s typography standards confirmed the checklist’s design for high-stress legibility, achieving an exceptional 95.24% “Yes” compliance rate across 21 applicable criteria.

Key design compliance was confirmed, leveraging principles such as:

  • Font: Use of a sans-serif font (Gill Sans) to improve character recognition.
  • Case: Predominantly lower-case text to enhance readability speed under duress.
  • Spacing: Vertical and horizontal spacing meets or exceeds recommendations, preventing character crowding.
  • Contrast: Employs high-contrast black characters on a white background for maximum clarity.

The single ‘Partial’ compliance item related to the x-height of some fonts. While the overall font height met NASA standards, the x-height (the height of a lowercase ‘x’) of some characters was slightly below the 0.10-inch recommendation, a minor deviation within an otherwise highly compliant typographical design.
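For reference, each compliance score is simply the share of applicable criteria rated “Yes.” The sketch below reproduces the reported figures from the item counts they imply (33 of 36 for Stufflebeam, 42 of 46 for Bichelmeyer, 20 of 21 for NASA); these counts are inferred from the percentages and the stated ‘Partial’/‘No’ tallies, not drawn from the underlying audit worksheets.

```python
# Minimal sketch: compliance score = "Yes" ratings / applicable criteria.
# The counts below are inferred from the percentages reported above,
# not taken from the underlying audit worksheets.

audits = {
    "Stufflebeam (development)": (33, 36),  # 3 "Partial", 0 "No"
    "Bichelmeyer (formatting)":  (42, 46),
    "NASA (typography)":         (20, 21),  # 1 "Partial"
}

for name, (yes, applicable) in audits.items():
    score = 100 * yes / applicable
    print(f"{name}: {yes}/{applicable} = {score:.2f}% 'Yes'")

# Expected output:
#   Stufflebeam (development): 33/36 = 91.67% 'Yes'
#   Bichelmeyer (formatting):  42/46 = 91.30% 'Yes'
#   NASA (typography):         20/21 = 95.24% 'Yes'
```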

This high degree of theoretical compliance establishes the checklist’s robust design foundation; the following section details how this foundation translated into effective performance and user acceptance in high-fidelity field environments.

4.0 Field Validation: End-User Performance and Feedback

While compliance audits confirm theoretical soundness, only performance-based feedback from the target audience can validate a tool’s practical utility under realistic, high-pressure conditions. The Active Shooter Incident Management Checklist was subjected to two major field validation events, one in 2014 and a revalidation in 2017, to gather direct feedback from first responders.

4.1 Validation Methodology

A consistent methodology was employed for both validation exercises. Participants used the checklist during full-scale/functional hybrid training scenarios simulating the complexity of an actual active shooter event. To ensure broad experience, participants rotated through different incident command roles. Following the exercises, feedback was collected via survey instruments and post-scenario “hotwash” discussions.

4.2 Participant Demographics and Scenario Complexity

The validation events engaged a diverse group of first responders in scenarios of increasing complexity, ensuring the checklist was tested across a wide range of conditions and user types.

4.3 Quantitative Feedback Analysis (2014 vs. 2017)

User feedback was overwhelmingly positive in both validation events, with improvements in the 2017 revalidation demonstrating the success of iterative design refinements.

Yes/No Question Results (% “Yes” Responses)

The data shows consistently high ratings on the checklist’s core functions. The significant 10-point improvement in the clarity of its terminology from 87% in 2014 to 97% in 2017 is a direct testament to the efficacy of the iterative development and review cycle validated by the Stufflebeam audit and the focus on precise language validated by the Bichelmeyer audit.

Likert Scale Agreement Analysis (% Strongly Agree + Agree)

Participants consistently agreed that the checklist was well-organized, easy to use, and effective. The slight decrease in agreement for “Kept me on track” (94% to 90%) and “Improved my situational awareness” (91% to 88%) in 2017, despite overall higher ratings, may correlate with the significantly increased scenario complexity (up to 5 attackers and 150+ victims vs. 1-2 attackers and 80 victims). This suggests that while the checklist remained a highly effective tool, the cognitive load imposed by more complex scenarios may have slightly attenuated its perceived impact on these specific metrics.
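To make the comparison concrete, the percentage-point changes for the items quoted above follow directly from the reported figures. The sketch below uses only the values cited in this section, with item labels paraphrased from the prose; the remaining survey items are omitted here.

```python
# Percentage-point change between the 2014 validation and the 2017
# revalidation, using only the figures quoted in this section.
# Item labels are paraphrased from the prose above.
quoted_results = {
    "Clarity of terminology (Yes/No item)":       (87, 97),
    "Kept me on track (Likert agreement)":        (94, 90),
    "Improved my situational awareness (Likert)": (91, 88),
}

for item, (pct_2014, pct_2017) in quoted_results.items():
    delta = pct_2017 - pct_2014
    print(f"{item}: {pct_2014}% -> {pct_2017}% ({delta:+d} points)")
```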

The overwhelmingly positive user feedback from these rigorous field exercises confirms the checklist’s real-world effectiveness.

5.0 Conclusion: A Verified and Validated Job Aid

The Active Shooter Incident Management Checklist has been subjected to a rigorous, multi-pronged validation process that combines theoretical compliance with practical, performance-based evaluation. The evidence gathered from this comprehensive review leads to a clear and confident conclusion regarding its validity.

The key findings demonstrate that:

  • The checklist shows a high degree of compliance with expert-derived guidelines for development (Stufflebeam), ensuring its content is methodologically sound and comprehensive.
  • The checklist adheres closely to best practices for formatting (Bichelmeyer) and typography (NASA), confirming it is designed for ease of use, rapid comprehension, and legibility under stress.
  • Field validation exercises in both 2014 and 2017 with a diverse range of first responders yielded overwhelmingly positive feedback on the checklist’s content, clarity, usability, and effectiveness under pressure.

Based on this body of evidence, the Active Shooter Incident Management Checklist is found to be a valid job aid with appropriate content, format, terminology, and usability for Active Shooter Event Response.

This conclusion, first established in the initial 2014 validation, was reaffirmed and finalized in the 2017 revalidation review, attested to by C3 Pathways CEO/Chief Consultant William Godfrey on February 23, 2018.

Frequently Asked Questions

Why was a multi-faceted validation approach used instead of a single method?
Relying on a single validation method is insufficient because different methodologies uncover different types of flaws. A multi-faceted approach ensures robustness across usability, accuracy, and stress resilience.

Which standards was the checklist audited against?
The checklist was audited against Stufflebeam’s checklist development guidelines, Bichelmeyer’s formatting guidelines, and NASA’s Typography of Flight-Deck Documentation to ensure legibility under high-stress conditions.

Was the checklist tested in the field?
Yes. It underwent two major field validation events (2014 and 2017) involving law enforcement, fire, and EMS agencies in scenarios of increasing complexity.

How did the 2017 revalidation compare with the 2014 validation?
The 2017 scenarios were significantly more complex (up to 5 attackers and 150+ victims), yet user feedback showed improved ratings for clarity and terminology, validating the iterative design refinements.

Conducted By

William Godfrey
CEO, C3 Pathways | Lead Instructor, National Center for Integrated Emergency Response (NCIER®)

C3 Pathways & NCIER
Applied Research Division

Find the Perfect Training Class For You