Circulation 36,825 • Volume 18, No. 1 • Spring 2003

Special Issue: Safety First: Ensuring Quality Care in the Intensely Productive Environment – The HRO Model

David M. Gaba, MD

A High Reliability Organization (HRO) repeatedly accomplishes its mission while avoiding catastrophic events, despite significant hazards, dynamic tasks, time constraints, and complex technologies. Examples include civilian and military aviation. We may improve patient safety by applying HRO concepts and strategies to the practice of anesthesiology.

Editor’s Note: Since 1986, the Anesthesia Patient Safety Foundation has made great strides in its mission to assure that no patient is harmed by anesthesia. Anesthesia is now safer than ever, but much remains to be done. The next challenge for patient safety may be to mirror high reliability organizations (HROs): organizations that operate under high-stress, high-tempo, “all-weather” conditions with an extremely low error rate of less than 1:1,000,000 (also referred to as “six sigma” reliability). The airline industry (both civilian and military) recognizes that a mere 99.99% reliability is unacceptable: such reliability would predict 1 airline accident every 20 days at a regional airport with 500 flights per day. Anesthesia, like aviation, operates in a high-tempo, high-stress, all-condition environment. This Special Issue of the APSF Newsletter introduces critical concepts of HRO theory and practice that carry vitally important lessons for health care in general and for anesthetic practice in particular.
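The Editor’s arithmetic is easy to verify. A minimal sketch, using only the figures stated in the note above (500 flights per day, 99.99% versus six sigma reliability); the Python framing is illustrative, not part of the original:

```python
# Checking the Editor's Note arithmetic (figures from the note above).
flights_per_day = 500        # regional airport, as stated
reliability = 0.9999         # "mere" 99.99% reliability
failure_rate = 1 - reliability

expected_failures_per_day = flights_per_day * failure_rate
days_between_accidents = 1 / expected_failures_per_day
print(round(days_between_accidents))  # 20 -> one accident every 20 days

# Six sigma reliability as defined in the note: error rate < 1:1,000,000.
six_sigma_failure_rate = 1e-6
days_between_accidents_hro = 1 / (flights_per_day * six_sigma_failure_rate)
print(round(days_between_accidents_hro))  # 2000 days, roughly 5.5 years
```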

The public expects that health care, like a number of other human endeavors, should be a high reliability undertaking with little risk of preventable harm. Years ago, when medicine had few therapies to offer patients, the risks of harm due to care were low. In the current era, and increasingly over time, the processes of health care have become intrinsically hazardous, wielding powerful technologies and medications that, if used improperly or in error, have the power to injure, maim, or kill. As detailed by Professor Roberts in this issue of the Newsletter (see page 13), other industries have come to grips with the routine conduct of activities that bear intrinsic hazard. At their best, organizations in these domains have been described as High Reliability Organizations (HROs). As Professor Roberts describes, the theory of HROs has evolved over nearly 20 years.

Many of these industries share key features with health care that make them useful, if approximate, models. These features include the following:

  • Intrinsic hazards are always present
  • Continuous operations, 24 hours a day, 7 days a week, are the norm
  • There is extensive decentralization
  • Operations involve complex and dynamic work
  • Multiple personnel from different backgrounds work together in complex units and teams

Operators on the flight deck of an aircraft carrier are a classic HRO model.

Even more than other hazardous industries, health care, including the perioperative setting, is not yet perfectly reliable and safe. It is difficult to measure precisely the rate of injury or death from the processes of care, and there has been considerable debate, particularly about the anesthesia-related death or injury rate. There is wide agreement that death or serious injury is a very rare event for healthy patients having routine anesthesia and surgery in accredited facilities. Yet even this very low death rate per operation is an order of magnitude higher than the fatal accident rate per departure of airliners. Rates of apparently preventable injury and death are much higher for sicker patients or those having more complex surgery.

Of course, perioperative health care is not the same as generating electricity, flying airliners, or building a space station. For one thing, unlike nuclear power plants or airplanes, we do not design or manufacture human beings, nor do we receive an official instruction manual. Also, unlike some other industries in which the activities are relatively “elective” (the airliner doesn’t have to fly from New York to Chicago tonight), in health care there are situations in which procedures must be performed even if the hazards are particularly high. Nonetheless, perioperative health care still has much to accomplish to qualify as a high reliability undertaking, and we have much to learn from HRO theory.

What Specific Lessons Can We Learn From HRO Theory?

Although HRO theory (HROT) is now a complex amalgam of approaches and viewpoints, some of the elements most relevant to perioperative health care are discussed below. In general terms, HROT holds that appropriate organizational control can yield nearly failure-free results despite high hazard and high tempo, provided the organization (and the overall industry) embodies the characteristics listed in Table 1. The remainder of this paper discusses these characteristics more fully as they apply to perioperative health care.

Table 1. Key Elements of a High Reliability Organization
  • Systems, structures, and procedures conducive to safety and reliability are in place.
  • Intensive training of personnel and teams takes place during routine operations, drills, and simulations.
  • Safety and reliability are examined prospectively for all the organization’s activities; organizational learning by retrospective analysis of accidents and incidents is aggressively pursued.
  • A culture of safety permeates the organization.

Increased Standardization Needed

The structures of health care’s institutions and organizations may not be well suited to achieving optimum safety or reliability. The industry is massively decentralized and poorly integrated. For example, while commercial aviation has thousands of departures per day, seemingly similar to the number of anesthetics and surgical procedures, the industry is structured very differently: only 10 organizations (airlines) handle more than 95% of passenger miles, whereas health care comprises on the order of 6,000 hospitals probably owned by more than 1,000 firms.

There is little standardization: each organization, indeed each work unit or each clinician, has a different approach to common problems. To be sure, excessive or slavish reliance on standard operating procedures can be detrimental to reliability and safety, and health care of human beings clearly requires more flexibility than does flying highly uniform airliners. Still, the pendulum in health care currently swings so far from the use of standard procedures that we probably have much to gain by increasing such standardization. Indeed, the whole thrust of evidence-based medicine is to disseminate and standardize proven best practices.

There are few economies of scale and little integration between facilities or between firms. Financial incentives are not fully aligned with the goals of reliability and safety. The public plays a role in this misalignment: people largely wish (irrationally) to save money on everyone else’s health care, but not on their own or their family’s. (This attitude is a corollary to “Wildavsky’s Law of Medical Money.”) In HROs, safety and reliability are accepted as properties of the system, which must be resilient in the face of individual failures. Appropriate redundancies in equipment and procedures, along with safety-oriented teamwork, are two of the ways this can be accomplished.
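The protective effect of redundancy can be illustrated with a simple probability sketch. The failure probabilities below are illustrative assumptions, not figures from this article, and the calculation assumes the safety layers fail independently, an assumption real systems often violate through common-mode failures:

```python
# Illustrative sketch: independent, redundant safeguards multiply down
# the chance that a failure reaches the patient. The 1-in-100 figure
# is an assumption for illustration, not a measured rate.
from math import prod

def system_failure_prob(layer_failure_probs):
    """Probability that every independent safety layer fails at once."""
    return prod(layer_failure_probs)

single = system_failure_prob([0.01])              # one 99%-reliable check
triple = system_failure_prob([0.01, 0.01, 0.01])  # three redundant checks

print(single)           # 0.01
print(f"{triple:.0e}")  # 1e-06 -- three modest layers approach six sigma
```

The caveat matters: if the layers share a common cause of failure (the same fatigued team, the same misread label), the multiplication no longer holds, which is why HROs pair redundancy with safety-oriented teamwork rather than relying on either alone.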

Cognition, Teamwork, and Training Are Crucial

The emphasis on systems does not mean that people and their skills are unimportant. Systems are made up of people working in organizations. The dynamism, complexity, and risk in industries of high intrinsic hazard require special attention to decision making processes of individual personnel and the teams in which they work. In fact, it appears that well-functioning clinical “microsystems” within health care institutions may be the best current examples of HROs in health care. Such HRO-like microsystems are still found only sporadically.

The literature pertaining to cognition in dynamic team-based environments is already large and growing rapidly. Nonetheless, a few key points can be summarized here.

  • One cannot assume that individual workers are “interchangeable parts” who can be automatically slotted into teams based on their formal credentials, with equivalent results. Health care often makes this assumption in staffing. To the extent that HROs have succeeded in doing this (in commercial aviation it is common for flight crew members to have never flown with each other before), it is because of the detailed structuring of work procedures and the extensive training of crew members in techniques of team-building and working together. Health care workers likewise need intensive training in teamwork, along with high-intensity training of their “crews” and “teams” (training is discussed further below).
  • There may also be benefits from having teams that regularly work and train together. There is conjecture that these kinds of organizational features are behind results suggesting that centers that perform a high-volume of specific surgical procedures have (on average) better outcomes than those that do not. The high volume favors the creation of dedicated organizations and teams, which may be the true explanation for the difference.
  • Work units in HROs “flatten the hierarchy” when it comes to safety-related information. Hierarchy effects can degrade the apparent redundancy offered by multi-person teams. One factor is “social shirking”: assuming that someone else is already doing the job. Another is “cue giving and cue taking”: personnel lower in the hierarchy do not act independently because they take their cues from the decisions and behaviors of higher-status individuals, regardless of the facts as they see them. A recent case illustrating some of these pitfalls is the sinking of the Japanese fishing vessel Ehime Maru by the US submarine USS Greeneville (ironically, part of a service typically regarded as a genuine high reliability organization). Hierarchy effects can be mitigated by procedures and cultural norms that ensure the dissemination of critical information regardless of rank or the possibility of being wrong.

Simulation Ensures Readiness

HROs recognize that intensive training and performance assessment in both routine work and simulation and drills pays off. HROs ensure that teams and work units hone their skills during routine operations. They debrief themselves routinely and keep track of individual and team performance. For example, every Naval aviator, regardless of experience level or seniority, is graded on every carrier landing. HROs also use simulation and drills extensively to ensure maximum readiness for critical but uncommon situations and to optimize team performance. Training is built into the work environment—it is not an add-on for the individual. The emphasis is on training the system, not just individuals. Moreover, training continues for the entirety of one’s career and is not limited to those learning the job.

In health care the nursing profession adheres to these principles more avidly than do physicians. Physicians rely on a fairly weak and haphazard system of “continuing medical education” to maintain individual abilities, a system largely left to the discretion of individuals, pursued on their own time and at their own expense. There is little systematic training of teams.

Simulation-based training for individuals, crews, and teams in health care is just beginning. This is one of the areas in which health care can most readily adopt lessons from HROs. The APSF was a leader in bringing simulation training to health care. Already there are numerous simulation centers in which such curricula could be applied to perioperative health care teams.

Organizational Learning Helps to Embed Lessons

Simulation training and drills are important tools for making the OR function as an HRO.

HROs aggressively pursue organizational learning about improving safety and reliability. They analyze threats and opportunities in advance. When new programs or activities are proposed they conduct special analyses of the safety implications of such programs, rather than waiting to analyze the problems that occur. Even so, problems will occur and HROs study incidents and accidents aggressively to learn critical lessons. Most importantly, HROs do not rely on individual learning of these lessons. They change the structure or procedures of the organization so that the lessons become embedded in the work.

Unlike many HROs, health care still has very weak systems for detecting, assessing, and responding to incidents and adverse events. These systems are local and isolated and cannot readily share lessons learned. They typically focus on individuals rather than on systems, and they rarely yield practical and effective changes in work practices.

Root cause analysis and failure modes and effects analysis are techniques used by HROs that are being introduced into health care. Considerable attention has been devoted to “event reporting systems.” There are also projects to upgrade traditional “morbidity and mortality conferences” to become more systems oriented. These are important beginnings. Nonetheless, to date, there has been too much attention to issues of “reporting” incidents and accidents, and too little attention to the organizational learning and changes in work practices that should come from the analysis of these reports.

Safety Culture Can Offset Production Pressure

What really makes an HRO is the way its people behave. The norms and routines of behavior based on shared values and expectations constitute “culture” (see Editor’s Note at end). HROs create and maintain cultures in which safety and reliability are the most important values and norms. In this issue, Sara Singer discusses safety culture in more detail, including how health care stacks up against acknowledged HROs in terms of culture (see “Survey”).

In health care a prime challenge to a culture of safety is “production pressure,” the overt or covert pressure to put production (as in throughput and efficiency) ahead of safety, despite lip service to the contrary. This has been shown to be a real problem in the operating room. All endeavors have pressure to work quickly, efficiently, and with few delays. There is an intrinsic asymmetry between production and safety such that signals about production are inherently stronger than signals about safety. Yet HROs have established mechanisms to ensure that these pressures do not overcome real safety concerns. They have evolved formal mission rules, checklists, milestones, and a system whereby even the most junior person feels empowered and obligated to halt production for a credible safety threat. Developing such mechanisms will be important for health care.


No organization is a perfect High Reliability Organization, nor is health care completely devoid of HRO-like structures and procedures. Yet medicine, which is pledged to “first do no harm,” is far behind other industries in adopting the organizational principles needed to strive for high reliability and safety. Many of the HRO lessons will be difficult to adopt widely; some challenge deeply embedded aspects of the current system, and most will require the expenditure of resources to yield change and improvement. One thing is certain, however: it will likely be impossible to achieve the nearly error-free results expected by the public without heeding the lessons of High Reliability Organization theory. Attempting to do so without them would amount to putting Band-Aids on gaping wounds.

Over the next year the APSF intends to begin a new initiative in High Reliability Perioperative Health Care. Our goal will be to help anesthesiologists and others in the perioperative setting to implement as fully as possible the lessons that can be extracted from HRO theory. Some programs may be targeted at anesthesia personnel alone. Many will need to be collaborative efforts with our surgical and nursing colleagues. We will seek to establish formal institutional links with the safety-oriented components of professional societies and organizations representing the entire spectrum of perioperative health care.

Dr. Gaba is Director of the Patient Safety Center of Inquiry at the VA Palo Alto Health Care System and Professor of Anesthesia at Stanford University School of Medicine.

Suggested Reading/General References

  1. Committee on Quality of Health Care in America. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press, 2001.
  2. Cooper J, Gaba D. No myth: anesthesia is a model for addressing patient safety (editorial). Anesthesiology 2002;97:1335-1337.
  3. Gaba DM. Structural and organizational issues in patient safety: a comparison of health care to other high-hazard industries. California Management Review 2001;43:83-102.
  4. Gaba D, Howard S, Jump B. Production pressure in the work environment: California anesthesiologists’ attitudes and experiences. Anesthesiology 1994;81:488-500.
  5. Heimann C. Acceptable risks: politics, policy, and risky technologies. Ann Arbor, MI: The University of Michigan Press, 1997.
  6. Helmreich R, Foushee H. Why crew resource management? In: Weiner E, Kanki B, Helmreich R, eds. Cockpit Resource Management. San Diego, CA: Academic Press, 1993:3-46.
  7. Helmreich R, Merritt A. Culture at work in aviation and medicine. Aldershot, UK: Ashgate Publishing Limited, 1998.
  8. Lagasse RS. Anesthesia safety: model or myth? A review of the published literature and analysis of current original data. Anesthesiology 2002;97:1609-1617.
  9. Mohr J, Batalden P. Improving safety on the front lines: the role of clinical microsystems. Qual Saf Health Care 2002;11:45-50.
  10. Sexton JB, Thomas EJ, Helmreich RL. Error, stress, and teamwork in medicine and aviation: cross sectional surveys. BMJ 2000;320:745-749.
  11. Rochlin G, La Porte T, Roberts K. The self-designing high reliability organization: aircraft carrier flight operations at sea. Naval War College Review 1987;42 (Autumn):76-90.
  12. Roberts K. New challenges in organizational research: high reliability organizations. Industrial Crisis Quarterly 1989;3:111-125.
  13. Roberts K. Some characteristics of high reliability organizations. Organization Science 1990;1:160-177.
  14. Roberts K, Rousseau D, La Porte T. The culture of high reliability: quantitative and qualitative assessment aboard nuclear powered aircraft carriers. Journal of High Technology Management Research 1994;5:141-161.
  15. Roberts KH, Tadmor CT. Lessons learned from non-medical industries: the tragedy of the USS Greeneville. Qual Saf Health Care 2002;11:355-357.
  16. Wildavsky A. Doing better and feeling worse: the political pathology of health policy. Daedalus 1977:105-123.

Editor’s Note: This Issue makes reference to the term Safety Culture. Experts in high reliability organization theory and practice have also introduced the term Safety Climate. While there are different implications of the two terms, for simplicity we have chosen Safety Culture to embody both concepts.