Human Error in Medicine

First things first: "medical error" is not "medicine"; it is "error." The discipline appropriate to its study and diagnosis is not medicine but the theory of error - errorology! What is involved is a part of the study of human behavior. That the errors happen to occur in a medical setting is the reason for my speaking here. If the identical error happened to occur in a nuclear power plant, I would be saying very similar things to a meeting of nuclear power specialists (with some variation in nomenclature, of course). It has been said repeatedly, over thousands of years, that to err is part of being human. For example:

  1. Errare Humanum Est; to err is human. (Probably a variation on Plutarch, Morals, c. 100 AD)

  2. "I presume you're mortal, and may err." (Shirley, The Lady of Pleasure, 1635)

  3. "To err is human; to forgive divine." (Pope, Essay on Criticism 1711)

  4. "To err is human; to forgive is against company policy." (Senders, various, 1978)

All of these state that errors will be made by people despite their determination to avoid them. Yet people are consistently held accountable for their errors when they lead to accidents, that is, to adverse outcomes. Is this proper? I argue that it is not, in the same way that in law no one is held accountable for acts of God.

The Distinction Between Errors And Accidents

What is an error? From the external viewpoint, an error is a failure to perform an intended action which was correct given the circumstances. In my view an error can occur only if there was or should have been an appropriate intention to act on the basis of a perceived or a remembered state of events, and if the action finally taken was not that which was or should have been intended. An error is not defined by an adverse or serious outcome. An adverse outcome may occur with no error if the intention was the proper one, the action was properly executed, and the outcome was probabilistic in nature (as in playing a game, in deciding whether to carry an umbrella, or in administering a medication or performing an operation known to be risky).

What is an accident? An accident is an unplanned, unexpected, and undesired event, usually with an adverse outcome. An adverse outcome after an error, by this definition, must be construed to be an accident. No one plans an error; no one expects an error; no one desires an error.

The Relation Between Errors and Accidents

An error is a psychological event with psychological causes, if errors are caused at all (there is always the possibility that the causes of some or all errors cannot be identified). An error may have any of a (possibly large) number of causes. A defined causal mechanism can give rise to a taxonomy of errors. There are many possible causal taxonomies of error.

Some Taxonomies of Errors

Internal Processes

  • Input Error or Misperception: the input data are incorrectly perceived, an incorrect intention is formed, and the wrong action is performed; that is, an action other than what would have been intended, had the input been correctly perceived. For example, I may be confronted by the phrase "1000 mg" and see it as "100.0 mg". I decide that it should be administered as a bolus into a Y-port and I successfully do so. A fatal overdose results.

  • Intention Error or Mistake: the input data are correctly perceived, an incorrect intention is formed, and the wrong action is performed; that is, an action other than what should have been intended given that the input was correctly perceived. For example, I may be confronted by the phrase "1000 mg" and see it as "1000 mg". I incorrectly decide that it should be administered as a bolus into a Y-port and I successfully do so. A fatal overdose results.

  • Execution Error or Slip: the input data are correctly perceived, the correct intention is formed, and the wrong action is performed; that is, an action other than what was intended. For example, I may be confronted by the phrase "1000 mg" and see it as "1000 mg". I correctly decide that it should be administered as a drip after dilution in a drip bag. I become distracted while approaching the patient and, from habit, inject the contents as a bolus into a Y-port. A fatal overdose results.
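
The three categories above can be read as a check on which stage of the perception-intention-action chain failed first. The following is an illustrative sketch only; the class, field, and function names are invented for this example and do not come from the text:

```python
# Illustrative sketch: classify an error by the first stage of the
# perception -> intention -> action chain that went wrong.
from dataclasses import dataclass

@dataclass
class ErrorEvent:
    perceived_correctly: bool   # was the input read as written?
    intended_correctly: bool    # was the right intention formed?
    executed_correctly: bool    # was the intended action performed?

def classify(e: ErrorEvent) -> str:
    if not e.perceived_correctly:
        return "input error (misperception)"
    if not e.intended_correctly:
        return "intention error (mistake)"
    if not e.executed_correctly:
        return "execution error (slip)"
    return "no error"

# The three Lidocaine examples above, in order:
print(classify(ErrorEvent(False, False, False)))  # input error (misperception)
print(classify(ErrorEvent(True, False, False)))   # intention error (mistake)
print(classify(ErrorEvent(True, True, False)))    # execution error (slip)
```

The point of the ordering is that an earlier failure makes the later stages moot: a misperception produces a wrong intention and a wrong action, but the error is classified by where the chain first broke.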

Attribution to Locations

  • Endogenous Error: an error which arises from processes inside the actor. The elimination or reduction of such errors must involve psychology, physiology, or neurology. The error resulting from distraction cited above is endogenous. It probably results from the capture of a lower-probability process (injection into a bag) by a higher-probability process (injection into a Y-port), the two processes sharing common elements of action.

  • Exogenous Error: an error which arises from processes outside the actor. The elimination or reduction of such errors must involve engineering and design of objects and work environments. The error may arise from the occasional use of an extraneous ".0", since this allows the false interpretation of "2000" as "200.0". Better yet, the probability of the error could be reduced by spelling out the amount as "TWO THOUSAND", since this will rarely, if ever, be read as "TWO HUNDRED".


If an error results in an action, then there is a phenomenon which can be observed. The particular appearance of the error we may call its mode. One reasonable taxonomy of error modes is:

  • Error of Omission: an error characterized by the leaving out of an appropriate step in a process.

  • Error of Insertion: an error characterized by the adding of an inappropriate step to a process.

  • Error of Repetition: an error characterized by the inappropriate repetition of a step normally appropriate to a process. (This may be reduced to the insertion of a step which would ordinarily be appropriate.)

  • Error of Substitution: an error characterized by the use of an inappropriate object, action, place, or time instead of the appropriate one. (This may be reduced to the omission of a step followed by the insertion of some other, inappropriate, step.)
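
These four modes can be read as elementary edit operations on a sequence of steps. As an illustration under that assumption (the function and the step names are hypothetical, not from the text), comparing an intended step sequence with the executed one recovers the mode:

```python
# Illustrative sketch: recover the mode of an error by comparing the
# intended step sequence with the executed one.
def error_mode(intended, executed):
    if executed == intended:
        return "no error"
    # omission: exactly one intended step was left out
    if any(intended[:i] + intended[i + 1:] == executed
           for i in range(len(intended))):
        return "omission"
    # an extra step was added: repetition if the step already belongs
    # to the process, insertion otherwise
    for i in range(len(executed)):
        if executed[:i] + executed[i + 1:] == intended:
            return "repetition" if executed[i] in intended else "insertion"
    # substitution: same length, exactly one step differs
    if (len(executed) == len(intended)
            and sum(a != b for a, b in zip(intended, executed)) == 1):
        return "substitution"
    return "compound error"

steps = ["draw up", "dilute", "infuse"]
print(error_mode(steps, ["draw up", "infuse"]))           # omission
print(error_mode(steps, ["draw up", "bolus", "infuse"]))  # substitution
```

This also makes the parenthetical reductions above concrete: a repetition is an insertion of an already-appropriate step, and a substitution is an omission plus an insertion at the same position.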

How To Talk About Errors

It is common to discuss errors in medical settings in terms of their expressions (what was done wrong) or in terms of their consequences (what happened to the patient). We see, as a rule, only those consequences which result in injury or death. The number of errors involving Lidocaine, for example, is an interesting statistic, but it does not tell us much about why the errors occurred. What we do not see are those errors which occurred and were caught before they were completed. We do not have a good estimate of the probability of substitution errors on the night shift, or by physicians, or by pharmacists.

The mode of an error is a datum of great importance. It could, for example, help us in estimating the risk of introducing a new drug, a new package, or a new device into the hospital. The mode of an error will result in some kind of expression, i.e., something wrong will be done in the particular environment in which the action occurs. The expression must depend on what is available to be done in that environment. In a medical setting, an error of substitution (its mode) may result in a nurse's picking up a 2 gram prefilled Lidocaine syringe (its expression) instead of a 100 mg syringe. In a nuclear power plant the same error might be expressed by turning off the wrong pump. The error is the same in both cases: a wrong act has been substituted for the right act.

Finally there is a consequence, i.e., something will happen as a result of what was erroneously done. That is an "accident caused by an error." For example, the syringe substitution error can result in a massive overdose of Lidocaine.

What Can Be Done About Errors?

An error can occur, and can be self-detected and (sometimes) corrected, at many points in the sequence of mental events between a perception and the resulting action. An error can occur, and can be self-detected and (sometimes) corrected, at many points in the sequence of physical events between the beginning and the end of an action. Such error detection can be of the error's mode, of its expression, or of its consequence. For example, a nurse may start to reach for a 2 gram Lidocaine syringe and change the motion towards its correct goal, the 100 mg syringe. This correction might be a conscious act or not; little work has been done on the analysis of incipient errors. A nurse may actually pick up the wrong syringe and replace it with the correct one, and so on. There are many opportunities for self-detection. Personal experience tells us that the probability of such self-corrected errors will be high, and that such errors are common. Some recent data gathered by Senders and Cohen (1) suggest that for every error completed by nurses (e.g. the actual use of a wrong syringe) there are about ten which are caught before they are completed, so that there is no accident and, therefore, no report. Errors are much more likely than we might think on the basis of data from present reporting systems.
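
Taking the cited ratio at face value (about ten caught errors for every completed one), a back-of-envelope calculation shows how badly accident reports understate the underlying error rate:

```python
# Back-of-envelope calculation from the Senders & Cohen ratio cited
# above: roughly 10 errors are caught for every 1 that is completed.
completed = 1
caught = 10
total = completed + caught

p_self_detection = caught / total          # ~0.91
underreporting_factor = total / completed  # reports miss ~11x the errors

print(f"self-detection probability = {p_self_detection:.2f}")
print(f"true error rate = {underreporting_factor:.0f} x the accident rate")
```

On this reading, a reporting system that counts only completed accidents sees less than a tenth of the errors actually occurring.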

For the internal points in the sequence, the mental events, possible remedial actions must be psychological in nature. Sellen (6), for example, suggests the possibility of training in the use of tactics leading to an improved probability of self-detection of error. For the external points in the sequence there must be systems analysis and redesign of elements of the system and of the system as a whole. The goal of such redesign (of means of use, of packaging, of labeling, of warnings) is to make the object announce its identity to the user through many independent and redundant routes. In an ideal world, a prescription would say in clear and unambiguous words what medication was to be given to which patient, when, how, how much, and so on. A medication container would tell the person holding it what its name is, what the appropriate dose is, how it should be used, and what the consequences will be if it is used in any of a variety of improper ways. And it would say all this in multiple ways, clearly and unambiguously, as if on the assumption that the person holding it was blind, stupid, and ignorant. It is an unfortunate fact that the whole process, from the writing of a prescription to the preparation of the dose to the administration of the medication, is shot full of problems stemming from appalling handwriting, ambiguous abbreviations, poorly designed packaging, and non-standard labeling. The gruesome catalog of medication errors presented by Davis and Cohen (2) demonstrates the effect of the absence of a controlled vocabulary (3) coupled with an absolute, enforced rejection of deviations from it by everyone involved.

The Mental Act Of God (MAOG)

Should people, the actors, be blamed for their errors? Should they be held responsible? Blame implies a theory that incipient errors can be perceived by the actor before they are executed, and voluntarily controlled to prevent their execution. Responsibility implies a theory that consequences arise because of flaws in behavior. An "Act of God" is defined thus: "In law; a direct, sudden, and irresistible action of natural forces, such as could not humanly have been foreseen or prevented." (The American College Dictionary); and thus: "Operation of uncontrollable natural forces." (Concise Oxford Dictionary)

Errors, to the extent that we have data, are random; the moment when an error will occur cannot be predicted. There is no "aura" which signals to an actor that an error is about to occur. From the point of view of the actor, the error that (s)he commits is an MAOG. The actor is the victim of the error; the patient is the victim of the expression of the error in a medical setting which permits the error to be completed and produce an injury.

When All Else Fails Use Failure Mode Analysis

Finally, one must accept the fact that various errors will occur and try to prevent or inhibit the translation of the error into an accident. The Food and Drug Administration and the manufacturers must stop expecting nurses and physicians to use things correctly every time. Each medication package and each medical device must be subjected to Failure Mode Analysis. In brief, one must ask:

  • What incorrect actions can people take?

  • What will be the result of the incorrect actions?

  • How can we prevent those actions from being completed?

This means that the possible ways in which each package or device can be misused should be exhaustively tabulated. Then the outcomes of each misuse are identified and evaluated. Those which are unacceptable must be designed out. That is, if a possibility is undesirable, then the possibility must be eliminated. This is like locking the barn door before the horses gallop out; the more usual approach is to use a running-horse detector to lock the doors. For example, the IV system with tubing, bags, Y-ports, and the like can be designed so that a prefilled syringe which is supposed to be injected into an LVP (large-volume parenteral) bag and used as a drip in dilute form is incapable of being injected into the arm of a Y-port. This requires, of course, meticulous analysis of even such simple systems as a prefilled syringe or the IV set, and the enforcement of industry-wide standards to provide assurance of incompatibility where it is necessary.
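
A failure-mode tabulation of the kind just described might be sketched as follows. The misuse entries, consequences, and acceptability judgments here are invented examples, not an actual analysis of any product:

```python
# Hypothetical failure-mode table for a prefilled 2 g syringe. Each row:
# (possible misuse, likely consequence, is the outcome acceptable?)
failure_modes = [
    ("inject full contents as an IV bolus", "massive overdose", False),
    ("inject into a Y-port instead of the LVP bag", "undiluted dose", False),
    ("delay dilution by a few minutes", "slightly late dose", True),
]

# The design rule: every misuse with an unacceptable outcome must be made
# physically impossible (e.g. by an incompatible connector), not merely
# warned against.
must_design_out = [misuse for misuse, _, acceptable in failure_modes
                   if not acceptable]
print(must_design_out)
```

The table, not the code, is the substance: the exercise forces the designer to enumerate misuses before they occur, rather than waiting for accident reports to reveal them.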


There are few or no "medical errors"; there are many errors which occur in medical settings. Those that are not prevented from running their courses lead to accidents. These are the ones which come to our attention.

Many of these errors stem from the absence of a controlled and mandatory vocabulary for use in the medical setting. Such errors could be eliminated.

Since not all errors can be prevented, it is necessary to reduce the consequences of the expression of error in the medical setting. Failure mode analysis is the appropriate tool.

Information must be gathered on those errors that do not lead to reportable events. Such data are needed if we are to predict what kinds of incorrect things are likely to be done in the future.

Whether one knows the causes of some errors, and whether one can prevent some errors, are not the ultimate issues. The ultimate issues are:

  • The identification of the modes of errors in medical settings

  • The prediction of the expression of those errors

  • The use of training and design to improve self detection

  • The interdiction of their transformation into accidents

References And Recommended Readings

1) Senders, J. and M. Cohen, Near Misses and Real Accidents, unpublished survey of nurses' errors, 1992. A questionnaire relating to recalled medication accidents and near misses was filled out by a small sample of nurses. Of 25 who responded one way or the other to the relevant question, only 1 recalled having actually administered an incorrect syringe to a patient; 10 recalled having almost done so but detected the error before it had been translated into an accident. More data of this kind will be collected and analyzed.

2) Davis, N. and M. Cohen, Medication Errors: causes and prevention, N.M. Davis Assoc., Huntingdon Valley, PA, 1983. This is a catalog of 'errors' submitted over the years to the journal Hospital Pharmacy, and a discussion and analysis of the various kinds which occur in medical settings. It is a must for anyone involved in the prescription, preparation, and administration of medications.

3) Davis, N., Medical Abbreviations: 7000 conveniences at the expense of communications and safety, N.M. Davis Assoc., Huntingdon Valley, PA, 1988. The title says it all. It is simultaneously a source of information about what various abbreviations (may) mean, and an admonition never to use them.

4) Senders, J. and N. Moray, Human Error: cause, prediction, and reduction, Lawrence Erlbaum Associates, Hillsdale, NJ, 1991. This volume is a synthesis of the thoughts of 22 scientists in various fields about the nature and source of human error. Its purpose is to stimulate thought and research on the topic.

5) Reason, J., Human Error, Cambridge Univ. Press, 1990. This is a mixture of theory and practicality. Reason surveys the work of others in the field and draws from his own experiments to arrive at a comprehensive overview of human error for behavioral scientists and for those interested in application to real world problems.

6) Sellen, A., Mechanisms of Human Error and Human Error Detection, unpublished PhD dissertation, Univ. of California at San Diego, 1990. Available from Dissertation Abstracts, Ann Arbor, MI.

Copyright © 2003 Marc Green, Ph.D., All rights reserved. No portion of this article may be reproduced without the express written permission of the copyright holder. If you use a quotation, excerpt or paraphrase of this article, except as otherwise authorized in writing by the author of the article you must cite this article as a source for your work and include a link back to the original article from any online materials that incorporate or are derived from the content of this article.

This article was last reviewed or amended on Nov 8, 2014.