
HRO Has Prominent History

Karlene H. Roberts, PhD

Research into and the management of organizational errors have their social science roots in human factors, psychology, and sociology. The human factors movement began during World War II and was aimed both at improving equipment design and at maximizing human effectiveness. In psychology, Barry Turner’s seminal book, Man-Made Disasters, pointed out that until 1978 the only interest in disasters was in the response to them, not in their precursors. Turner identified a number of sequences of events associated with the development of disaster, the most important of which is incubation: disasters do not happen overnight. He also directed attention to processes, other than simple human error, that contribute to disaster. A sociological approach to the study of error was also emerging. In the United States just after World War II, some sociologists became interested in the social impacts of disasters. The consistent themes in these researchers’ publications include the myths of disaster behavior, the social nature of disaster, the adaptation of community structure in the emergency period, the dimensions of emergency planning, and the differences among social situations that are conventionally considered disasters.1

In his well-known book, Normal Accidents, Charles Perrow concluded that in highly complex organizations in which processes are tightly coupled, catastrophic accidents are bound to happen. Two other sociologists, James Short and Lee Clarke,2 called for a focus on the organizational and institutional contexts of risk, because hazards and their attendant risks are conceptualized, identified, measured, and managed in these entities. They focused on risk-related decisions, which are “often embedded in organizational and institutional self-interest, messy inter- and intra-organizational relationships, economically and politically motivated rationalization, personal experience, and rule of thumb considerations that defy the neat, technically sophisticated, and ideologically neutral portrayal of risk analysis as solely a scientific enterprise” (p. 8). The realization that major errors, or the accretion of small errors into major ones, usually are not the result of the actions of any one individual had become too obvious to ignore.

High Reliability Organizations

This set the stage in 1984 for a research group at the University of California, Berkeley to begin studying organizations in which errors can have catastrophic consequences. They focused initially on organizations that seemed to behave very reliably, which they called high reliability organizations (HROs). Another group at the University of Michigan began addressing similar issues (e.g., Weick, 1987; Weick, Sutcliffe, and Obstfeld, 1999). Although these researchers represented different disciplines (psychology, political science, physics), they came together around an organizational perspective, and they took a different approach than most of those who preceded them: they were initially concerned with understanding success in organizations in which errors can result in catastrophe.

The Berkeley group’s initial work was done in the Federal Aviation Administration’s Air Traffic Control Center, in a commercial nuclear power plant, and aboard the U.S. Navy’s aircraft carriers. This group produced a number of findings, among them the following.

Organizations that must be successful all of the time continually reinvent themselves.3-5 For example, when a community emergency incident command system realizes that what it thought was a garage fire has become a hazardous materials incident, it completely restructures the response organization. These organizations also flexibly improvise on existing structure.5,6 An aircraft carrier, for instance, uses its functional units slightly differently depending on whether it is on a humanitarian mission, a search and rescue mission, or engaged in night flight operations training.

In these systems, decision-making migrates down to the lowest level consistent with decision implementation.7 The lowest-level people aboard U.S. Navy ships make decisions and contribute to decisions. The USS Greeneville hit a Japanese fishing boat in part because this mechanism failed: the sonar operator and fire control technician did not question their commanding officer’s activities, although their job descriptions required that they do so. Cultures of reliability are difficult to develop and maintain,8,9 as was evident aboard the Greeneville, where in a matter of hours the culture went from that of an HRO to that of an LRO (low reliability organization).

Finally, systems of organizations operate together to produce risk-enhancing or risk-mitigating outcomes.10 For a U.S. Navy battle group to behave reliably, all system members must act in concert, openly sharing communication, reducing status differentials at sea, and letting the people with the salient information and training make decisions. The carrier and its aircraft squadrons have to operate in concert with the battle group’s submarine, frigate, destroyer, and cruiser complement.

Based on her investigation of five commercial banks, Carolyn Libuser11 developed a management model that includes five processes she considers imperative if an organization is to maximize its reliability. They are:

  1. Process Auditing. An established system for ongoing checks and balances designed to spot expected as well as unexpected safety problems. Safety drills and equipment testing are included. Follow-ups on problems revealed in previous audits are critical.
  2. Appropriate Reward Systems. The payoff an individual or organization realizes for behaving one way or another. Rewards have powerful influences on individual, organizational, and inter-organizational behavior.
  3. Avoiding Quality Degradation. Comparing the quality of the system to a referent generally regarded as the standard for quality in the industry and ensuring similar quality.
  4. Risk Perception. This includes two elements: a) whether there is knowledge that risk exists, and b) if so, acknowledging it and taking appropriate steps to mitigate or minimize it.
  5. Command and Control. This includes five processes: a) decision migration to the person with the most expertise to make the decision, b) redundancy in people and/or hardware, c) senior managers who see “the big picture,” d) formal rules and procedures, and e) training, training, and more training.

The Michigan group primarily extended Karl Weick’s notions of sensemaking (e.g., Weick, 1990; Weick, 1993; Weick, Sutcliffe, and Obstfeld, 1999). These authors point to the importance of various people in the organization correctly perceiving the events before them and artfully tying those events together to produce a “big picture” that includes the processes through which error is avoided. A representation of the available knowledge might be a Venn diagram or a hologram: no one person has the whole story, but different individuals hold important parts of it, which are then tied together to represent the whole.

HROs in Health Care

HRO organizational processes sneaked into the health care arena, in both discussion and implementation. In 1989 the constructs from Libuser’s model were successfully used in revitalizing the pediatric intensive care unit at Loma Linda University Children’s Hospital (LLUCH). LLUCH has 250 beds and is the tertiary children’s hospital for a geographic area more than three times the size of the state of Vermont. The population served is 2.5 million people, 500,000 of whom are under the age of 15 years.

Discussions about whether HRO concepts have value in reducing patient error were introduced into the medical literature in 1994.12 Such discussions continued in 1996 with the formation of the National Patient Safety Foundation of the American Medical Association. A few years later the NPSF supported a research project that represented a direct extension of the HRO findings in a health care setting. At about the same time, Libuser’s model was used to develop a short questionnaire for obstetric nurses at the University of Minnesota.

The Patient Safety Center of Inquiry at the VA Palo Alto Health Care System and Stanford University uses HRO concepts extensively in applying theories of organizational safety to health care, in its simulation-based, teamwork-oriented training, and in its programs aimed at measuring and intervening in hospital safety cultures. Some of these activities are discussed elsewhere in this issue.

Have Other Industries Adopted HRO Principles?

[Photo caption: Drill, practice, and simulation prepare flight deck crews to deal with critical incidents.]

Commercial aviation was the first industry to develop HRO-like principles, following a deadly United Airlines accident in Portland in 1978. The research effort underlying this was small-group research on team building. Virtually all commercial airline companies now have some form of crew resource management (CRM), a training program that focuses on flight crew communication, decision-making, and the like, and that argues for the importance of reducing status differentials within the flight crew. Several airlines have extended CRM to in-flight crews and baggage handlers. U.S. military aviation has been slower to adopt CRM. After a series of deadly F-14 Tomcat accidents in 1996, the U.S. Navy developed its Command Safety Assessment Survey and began using it in safety stand-downs to assess safety culture. The Navy has since extended its use to aviation mechanical crews and Marine ground troops. The U.S. Coast Guard borrowed HRO principles in developing its “Prevention through People” program in the 1990s. That program is monitored by a “champions group,” a large group of outsiders who assess the program periodically and argue for its continuation.

In response to the transition to the euro and to Y2K, the Society for Worldwide Interbank Financial Telecommunication (S.W.I.F.T.) designed its “Failure is not an Option” program using HRO principles. S.W.I.F.T. is the consortium of financial institutions that moves 97% of the world’s money.

HRO constructs have also been applied to failing grammar schools in Wales. In the course of a year, applying many of the simple HRO constructs improved student performance, and the children in those schools now perform on par with children in more successful schools.

In 2001, BP (British Petroleum) developed a program to improve refinery availability using HRO constructs. More recently, Los Alamos National Laboratory has been developing a way to assess itself against HRO concepts.

Finally, the health care industry is building ever more complex organizations and undertaking increasingly difficult and complex activities. As Robert Pool states: “In a generation or two, the world will likely need thousands of high-reliability organizations running not just nuclear power plants, space flights, and air traffic control, but also chemical plants, electrical grids, computer and telecommunication networks, financial networks, genetic engineering, nuclear-waste storage, and many other complex, hazardous technologies. Our ability to manage a technology, rather than our ability to conceive and build it, may be the limiting factor in many cases.”13

Dr. Karlene H. Roberts is a Professor in the Walter A. Haas School of Business at the University of California, Berkeley.

References

  1. Dynes RR, Tierney KJ, Fritz CE. Foreword: The emergence and importance of social organization: the contributions of E.L. Quarantelli. In: Dynes RR, Tierney KJ, eds. Disasters, Collective Behavior, and Social Organization. Newark, Delaware: University of Delaware Press, 1994.
  2. Short JF, Clarke L. Organizations, Uncertainties, and Risk. San Francisco: Westview, 1992.
  3. Schulman P. The analysis of high reliability organizations. In: Roberts KH, ed. New Challenges to Understanding Organizations. New York: Macmillan, 1993: 33-54.
  4. La Porte TR, Consolini P. Working in practice but not in theory: theoretical challenges of high reliability organizations. Journal of Public Administration Research and Theory 1991;1:19-47.
  5. Bigley GA, Roberts KH. Structuring temporary systems for high reliability. Academy of Management Journal 2001;44:1281-1300.
  6. Weick KE, Roberts KH. Collective mind and organizational reliability: the case of flight operations on an aircraft carrier deck. Administrative Science Quarterly 1993;38:357-381.
  7. Roberts KH, Stout SK, Halpern JJ. Decision dynamics in two high reliability military organizations. Management Science 1994;40:614-624.
  8. Roberts KH, Rousseau DM, La Porte TR. The culture of high reliability: quantitative and qualitative assessment aboard nuclear powered aircraft carriers. Journal of High Technology Management Research 1994;5:141-161.
  9. Tadmor CT, Roberts KH. Structural failures and the development of an organizational breakdown: the tragedy of the USS Greeneville. Submitted.
  10. Grabowski M, Roberts KH. Risk mitigation in virtual organizations. Organization Science 1999;10:704-721. Also in Journal of Computer Mediated Communication 1998; 3, 4.
  11. Libuser CB. Organization Structure and Risk Mitigation. Doctoral dissertation, University of California, Los Angeles, 1994.
  12. Gaba D, Howard SK, Jump B. Production pressure in the work environment: California anesthesiologists’ attitudes and experiences. Anesthesiology 1994;81:488-500.
  13. Pool R. Beyond Engineering: How Society Shapes Technology. New York: Oxford University Press, 1997.