A system accident (or normal accident) is an "unanticipated interaction of multiple failures" in a complex system.[1] This complexity can lie in the technology, in the human organization, or, frequently, in both. A system accident can be easy to see in hindsight, but extremely difficult to foresee, because there are simply too many action pathways to seriously consider all of them. Charles Perrow first developed these ideas in the mid-1980s.[2] Safety systems themselves are sometimes the added complexity that leads to this type of accident.[3]
Pilot and author William Langewiesche used Perrow's concept in his analysis of the factors at play in a 1996 aviation disaster. He wrote in The Atlantic in 1998: "the control and operation of some of the riskiest technologies require organizations so complex that serious failures are virtually guaranteed to occur."[4][a]