WHAT IS A BIASED COMPUTER SYSTEM?

In its most general sense, the term bias means simply “slant.” Given this
undifferentiated usage, at times the term is applied with relatively neutral
content. A grocery shopper, for example, can be “biased” by not buying
damaged fruit. At other times, the term bias is applied with significant
moral meaning. An employer, for example, can be “biased” by refusing to
hire minorities. In this article we focus on instances of the latter, for if one
wants to develop criteria for judging the quality of systems in use—which
we do—then criteria must be delineated in ways that speak robustly yet
precisely to relevant social matters. Focusing on bias of moral import does
just that.
Accordingly, we use the term bias to refer to computer systems that
systematically and unfairly discriminate against certain individuals or
groups of individuals in favor of others. A system discriminates unfairly if
it denies an opportunity or a good or if it assigns an undesirable outcome to
an individual or group of individuals on grounds that are unreasonable or
inappropriate. Consider, for example, an automated credit advisor that
assists in the decision of whether or not to extend credit to a particular
applicant. If the advisor denies credit to individuals with consistently poor
payment records, we do not judge the system to be biased because it is
reasonable and appropriate for a credit company to want to avoid extending
credit privileges to people who consistently do not pay their bills. In
contrast, a credit advisor that systematically assigns poor credit ratings to
individuals with ethnic surnames discriminates on grounds that are not
relevant to credit assessments and, hence, discriminates unfairly.
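To make the contrast concrete, the sketch below is purely illustrative: the field names (missed_payments, surname) and the threshold of three missed payments are our hypothetical choices, not features of any actual credit system. One rule decides on grounds relevant to credit assessment; the other decides, in part, on grounds that are not.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    # Hypothetical record; real credit files are far richer.
    missed_payments: int
    surname: str

def reasonable_advisor(applicant: Applicant) -> bool:
    """Extends credit unless the applicant has a consistent record of
    unpaid bills, a ground relevant to credit assessment."""
    return applicant.missed_payments < 3

# Any such list is, by definition, irrelevant to creditworthiness;
# it is left empty here because its contents do not matter to the point.
TARGETED_SURNAMES = set()

def biased_advisor(applicant: Applicant) -> bool:
    """Systematically downgrades applicants because of their surname,
    a ground unrelated to whether they pay their bills: the unfair,
    systematic discrimination we call bias."""
    if applicant.surname in TARGETED_SURNAMES:
        return False
    return applicant.missed_payments < 3
```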
Two points follow. First, unfair discrimination alone does not give rise to
bias unless it occurs systematically. Consider again the automated credit
advisor. Imagine a random glitch in the system that, in an isolated case,
changes information in a copy of the credit record of an applicant who
happens to have an ethnic surname. The change in information causes a
downgrading of this applicant’s rating. While this applicant experiences
unfair discrimination resulting from this random glitch, the applicant could
have been anybody. In a repeat incident, the same applicant or others with
similar ethnicity would not be in a special position to be singled out. Thus,
while the system is prone to random error, it is not biased.
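The distinction can also be put in computational terms. In the hypothetical sketch below (the names, ratings, and penalty are ours, not drawn from any real system), a random glitch downgrades an arbitrary applicant, so a repeat run would likely strike someone else, whereas a systematic rule singles out the same group on every run.

```python
import random

def random_glitch(ratings: dict) -> str:
    """An isolated fault: downgrades one applicant chosen at random.
    Any applicant could be the victim, so the error, though unfair to
    whoever it strikes, is not systematic."""
    victim = random.choice(list(ratings))
    ratings[victim] -= 100
    return victim

def systematic_downgrade(ratings: dict, targeted: set) -> list:
    """Downgrades exactly the applicants in the targeted group on every
    run; the same people are singled out in each repeat incident."""
    hit = [name for name in ratings if name in targeted]
    for name in hit:
        ratings[name] -= 100
    return hit
```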
Second, systematic discrimination does not establish bias unless it is
joined with an unfair outcome. A case in point is the Persian Gulf War,
where United States Patriot missiles were used to detect and intercept
Iraqi Scud missiles. At least one software error identified during the war
contributed to systematically poor performance by the Patriots [GAO 1992].
Calculations used to predict the location of a Scud depended in complex
ways on the Patriot's internal clock. The longer the Patriot's continuous
running time, the greater the imprecision in the calculation. The deaths of
at least 28 Americans in Dhahran can be traced to this software error,
which systematically degraded the accuracy of Patriot missiles. While we
are not minimizing the serious consequence of this systematic computer
error, it falls outside of our analysis because it does not involve unfairness.
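The mechanism behind the degradation can be made concrete with a back-of-the-envelope calculation. The figures below follow the widely reported account of the GAO finding, in which each 0.1-second clock count was converted to seconds using a fixed-point approximation of one tenth that was off by roughly 9.5e-8; the Scud speed is approximate, and the sketch is illustrative arithmetic, not a reconstruction of the Patriot software.

```python
# Back-of-the-envelope version of the timing effect described above.
CHOPPING_ERROR_PER_TICK_S = 9.5e-8   # error in each 0.1 s clock count
TICKS_PER_HOUR = 10 * 3600           # the clock counts tenths of a second
SCUD_SPEED_M_PER_S = 1676            # approximate Scud velocity

def predicted_position_shift_m(hours_running: float) -> float:
    """The clock error grows linearly with continuous running time;
    multiplied by the Scud's speed, it shifts the predicted position."""
    clock_error_s = CHOPPING_ERROR_PER_TICK_S * TICKS_PER_HOUR * hours_running
    return clock_error_s * SCUD_SPEED_M_PER_S

# After roughly 100 hours of continuous operation, the clock is off by
# about a third of a second and the predicted Scud position by more than
# half a kilometer.
print(round(predicted_position_shift_m(100)))   # ~573
```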