
Trust in Automation: Designing for Appropriate Reliance

John D. Lee and Katrina A. See, University of Iowa, Iowa City, Iowa

Human Factors, Vol. 46, No. 1, Spring 2004, pp. 50–80. Copyright © 2004, Human Factors and Ergonomics Society. All rights reserved.

Address correspondence to John D. Lee, 2130 Seamans Center, Department of Mechanical and Industrial Engineering, University of Iowa, Iowa City, IA 52245; jdlee@engineering.uiowa.edu.
Abstract
Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.


INTRODUCTION
Sophisticated automation is becoming ubiquitous, appearing in work environments as diverse as aviation, maritime operations, process control, motor vehicle operation, and information retrieval. Automation is technology that actively selects data, transforms information, makes decisions, or controls processes. Such technology exhibits tremendous potential to extend human performance and improve safety; however, recent disasters indicate that it is not uniformly beneficial. On the one hand, people may trust automation even when it is not appropriate. Pilots, trusting the ability of the autopilot, failed to intervene and take manual control even as the autopilot crashed the Airbus A320 they were flying (Sparaco, 1995). In another instance, an automated navigation system malfunctioned and the crew failed to intervene, allowing the Royal Majesty cruise ship to drift off course for 24 hours before it ran aground (Lee & Sanquist, 2000; National Transportation Safety Board, 1997). On the other hand, people are not always willing to put sufficient trust in automation. Some operators rejected automated controllers in paper mills, undermining the potential benefits of the automation (Zuboff, 1988). As automation becomes more prevalent, poor partnerships between people and automation will become increasingly costly and catastrophic.

Such flawed partnerships between automation and people can be described in terms of misuse and disuse of automation (Parasuraman & Riley, 1997). Misuse refers to the failures that occur when people inadvertently violate critical assumptions and rely on automation inappropriately, whereas disuse signifies failures that occur when people reject the capabilities of automation. Misuse and disuse are two examples of inappropriate reliance on automation that can compromise safety and profitability. Although this paper describes reliance on automation as a discrete process of engaging or disengaging, automation can be a very complex combination of many modes, and reliance is often a more graded process. Automation reliance is not a simple binary process, but the simplification makes the discussion of misuse and disuse more tractable. Understanding how to mitigate disuse and misuse of automation is a critically important problem with broad ramifications.

Recent research suggests that misuse and disuse of automation may depend on certain feelings and attitudes of users, such as trust. This is particularly important as automation becomes more complex and goes beyond a simple tool with clearly defined and easily understood behaviors. In particular, many studies show that humans respond socially to technology, and reactions to computers can be similar to reactions to human collaborators (Reeves & Nass, 1996). For example, the similarity-attraction hypothesis in social psychology predicts that people with similar personality characteristics will be attracted to each other (Nass & Lee, 2001).

This finding also predicts user acceptance of software (Nass & Lee, 2001; Nass, Moon, Fogg, Reeves, & Dryer, 1995). Software that displays personality characteristics similar to those of the user tends to be more readily accepted. For example, computers that use phrases such as “You should definitely do this” will tend to appeal to dominant users, whereas computers that use less directive language, such as “Perhaps you should do this,” tend to appeal to submissive users (Nass & Lee). Similarly, the concept of affective computing suggests that computers that can sense and respond to users’ emotional states may greatly improve human-computer interaction (Picard, 1997). More recently, the concept of computer etiquette suggests that human-computer interactions can be enhanced by recognizing how the social and work contexts interact with the roles of the computer and human to specify acceptable behavior (Miller, 2002). More generally, designs that consider affect are likely to enhance productivity and acceptance (Norman, Ortony, & Russell, 2003). Together, this research suggests that the emotional and attitudinal factors that influence human-human relationships may also contribute to human-automation relationships.

Trust, a social psychological concept, seems particularly important for understanding human-automation partnerships. Trust can be defined as the attitude that an agent will help achieve an individual’s goals in a situation characterized by uncertainty and vulnerability. In this definition, an agent can be automation or another person that actively interacts with the environment on behalf of the person. Considerable research has shown the attitude of trust to be important in mediating how people rely on each other (Deutsch, 1958, 1960; Rempel, Holmes, & Zanna, 1985; Ross & LaCroix, 1996; Rotter, 1967). Sheridan (1975) and Sheridan and Hennessy (1984) argued that just as trust mediates relationships between people, it may also mediate the relationship between people and automation. Many studies have demonstrated that trust is a meaningful concept to describe human-automation interaction in both naturalistic (Zuboff, 1988) and laboratory settings (Halprin, Johnson, & Thornburry, 1973; Lee & Moray, 1992; Lewandowsky, Mundy, & Tan, 2000; Muir, 1989; Muir & Moray, 1996). These observations demonstrate that trust is an attitude toward automation that affects reliance and that it can be measured consistently. People tend to rely on automation they trust and tend to reject automation they do not. By guiding reliance, trust helps to overcome the cognitive complexity people face in managing increasingly sophisticated automation.

Trust guides – but does not completely determine – reliance, and the recent surge in research related to trust and reliance has produced many confusing and seemingly conflicting findings. Although many recent articles have described the role of trust in mediating reliance on automation, there has been no integrative review of these studies. The purpose of this paper is to provide such a review, link trust in automation to the burgeoning research on trust in other domains, and resolve conflicting findings. We begin by developing a conceptual model to link organizational, sociological, interpersonal, psychological, and neurological perspectives on trust between people to human-automation trust. We then use this conceptual model of trust and reliance to integrate research related to human-automation trust. The conceptual model identifies important research issues, and it also identifies design, evaluation, and training approaches to promote appropriate trust and reliance.

TRUST AND AFFECT
Researchers from a broad range of disciplines have examined the role of trust in mediating relationships between individuals, between individuals and organizations, and even between organizations. Specifically, trust has been investigated as a critical factor in interpersonal relationships, where the focus is often on romantic relationships (Rempel et al., 1985). In exchange relationships, another important research area, the focus is on trust between management and employees or between supervisors and subordinates (Tan & Tan, 2000). Trust has also been identified as a critical factor in increasing organizational productivity and strengthening organizational commitment (Nyhan, 2000). Trust between firms and customers has become an important consideration in the context of relationship management (Morgan & Hunt, 1994) and Internet commerce (Muller, 1996). Researchers have even considered the issue of trust in the context of the relationship between organizations such as those in multinational firms (Ring & Van de Ven, 1992), in which cross-disciplinary and cross-cultural collaboration is critical (Doney, Cannon, & Mullen, 1998).

Interest in trust has grown dramatically in the last 5 years, as many have come to recognize its importance in promoting efficient transactions and cooperation. Trust has emerged as a central focus of organizational theory (Kramer, 1999); has been the focus of recent special issues of the Academy of Management Review (Jones & George, 1998) and the International Journal of Human-Computer Studies (Corritore, Kracher, & Wiedenbeck, 2003b); was the topic of a workshop at the CHI 2001 meeting (Corritore, Kracher, & Wiedenbeck, 2001a); and has been the topic of books such as that by Kramer and Tyler (1996).

The general theme of the increasing cognitive complexity of automation, organizations, and interpersonal interactions explains the recent interest in trust. Trust tends to be less important in well-structured, stable environments, such as procedure-based hierarchical organizations, in which an emphasis on order and stability minimizes transactional uncertainty (Moorman, Deshpande, & Zaltman, 1993). Many organizations, however, have recently adopted agile structures, self-directed work groups, matrix structures, and complex automation, all of which make the workplace increasingly complex, unstable, and uncertain. Because these changes enable rapid adaptation to change and accommodate unanticipated variability, there is a trend away from well-structured, procedure-based environments. Although these changes have the potential to make organizations and individuals more productive and able to adapt to the unanticipated (Vicente, 1999), they also increase cognitive complexity and leave more degrees of freedom for the individual to resolve. Trust plays a critical role in people’s ability to accommodate the cognitive complexity and uncertainty that accompany the move away from highly structured organizations and simple technology.

Trust helps people to accommodate complexity in several ways. It supplants supervision when direct observation becomes impractical, and it facilitates choice under uncertainty by acting as a social decision heuristic (Kramer, 1999). It also reduces uncertainty in gauging the responses of others, thereby guiding appropriate reliance and generating a collaborative advantage (Baba, 1999; Ostrom, 1998). Moreover, trust facilitates decentralization and adaptive behavior by making it possible to replace fixed protocols, reporting structures, and procedures with goal-related expectations regarding the capabilities of others. The increased complexity and uncertainty that has inspired the recent interest in trust in other fields parallels the increased complexity and sophistication of automation. Trust in automation guides reliance when the complexity of the automation makes a complete understanding impractical and when the situation demands adaptive behavior that procedures cannot guide. For this reason, the recent interest in trust in other disciplines provides a rich and appropriate theoretical base for understanding how trust mediates reliance on complex automation and, more generally, how it affects computer-mediated collaboration that involves both human and computer agents.

Definition of Trust: Beliefs, Attitudes, Intentions, and Behavior
Not surprisingly, the diverse interest in trust has generated many definitions. This is particularly true when considering how trust relates to automation (Cohen, Parasuraman, & Freeman, 1999; Muir, 1994). By examining the differences and common themes of these definitions, it is possible to identify critical considerations for understanding the role of trust in mediating human-automation interaction. Some researchers focus on trust as an attitude or expectation, and they tend to define trust in one of the following ways: “expectancy held by an individual that the word, promise or written communication of another can be relied upon” (Rotter, 1967, p. 651); “expectation related to subjective probability an individual assigns to the occurrence of some set of future events” (Rempel et al., 1985, p. 96); “expectation of technically competent role performance” (Barber, 1983, p. 14); or “expectations of fiduciary obligation and responsibility, that is, the expectation that some others in our social relationships have moral obligations and responsibility to demonstrate a special concern for others’ interests above their own” (Barber, p. 14). These definitions all include the element of expectation regarding behaviors or outcomes. Clearly, trust concerns an expectancy or an attitude regarding the likelihood of favorable responses.

Another common approach characterizes trust as an intention or willingness to act. This goes beyond attitude in that trust is characterized as an intention to behave in a certain manner or to enter into a state of vulnerability. For example, trust has been defined as “willingness to place oneself in a relationship that establishes or increases vulnerability with the reliance upon someone or something to perform as expected” (Johns, 1996, p. 81); “willingness to rely on an exchange partner in whom one has confidence” (Moorman et al., 1993, p. 82); and “willingness of a party to be vulnerable to the actions of another party based on the expectation that the other will perform a particular action important to the trustor, irrespective of the ability to monitor or control that party” (Mayer, Davis, & Schoorman, 1995, p. 712).

The definition by Mayer et al. (1995) is the most widely used and accepted definition of trust (Rousseau, Sitkin, Burt, & Camerer, 1998). As of April 2003, the Institute for Scientific Information citation database showed 203 citations of this article, far more than others on the topic of trust. The definition identifies vulnerability as a critical element of trust. For trust to be an important part of a relationship, individuals must willingly put themselves at risk or in vulnerable positions by delegating responsibility for actions to another party.

Some authors go beyond intention and define trust as a behavioral result or state of vulnerability or risk (Deutsch, 1960; Meyer, 2001). According to these definitions, trust is the outcome of actions that place people into certain states or situations. It can be seen as, for example, “a state of perceived vulnerability or risk that is derived from an individual’s uncertainty regarding the motives, intentions, and prospective actions of others on whom they depend” (Kramer, 1999, p. 571).

These definitions highlight some important inconsistencies regarding whether trust is a belief, attitude, intention, or behavior. These distinctions are of great theoretical importance, as multiple factors mediate the process of translating beliefs and attitudes into behaviors.

Ajzen and Fishbein (1980; Fishbein & Ajzen, 1975) developed a framework that can help reconcile these conflicting definitions of trust. Their framework shows that behaviors result from intentions and that intentions are a function of attitudes. Attitudes in turn are based on beliefs. According to this framework, beliefs and perceptions represent the information base that determines attitudes. The availability of information and the person’s experiences influence beliefs. An attitude is an affective evaluation of beliefs that guides people to adopt a particular intention. Intentions then translate into behavior, according to the environmental and cognitive constraints a person faces. In the context of trust and reliance, trust is an attitude and reliance is a behavior. This framework keeps beliefs, attitudes, intentions, and behavior conceptually distinct and can help explain the influence of trust on reliance. According to this framework, trust affects reliance as an attitude rather than as a belief, intention, or behavior. Beliefs underlie trust, and various intentions and behaviors may result from different levels of trust.

Considering trust as an intention or behavior has the potential to confuse its effect with the effects of other factors that can influence behavior, such as workload, situation awareness, and self-confidence of the operator (Lee & Moray, 1994; Riley, 1994). Trust is not the only factor mediating the relationship between beliefs and behavior. Other psychological, system, and environmental constraints intervene, such as when operators do not have enough time to engage the automation even though they trust it and intend to use it, or when the effort to engage the automation outweighs its benefits (Kirlik, 1993). Trust stands between beliefs about the characteristics of the automation and the intention to rely on the automation.

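To make the chain from beliefs to reliance concrete, the following is a minimal illustrative sketch, not a model from the article: it treats trust as an attitude derived from beliefs about the automation, forms an intention by comparing trust with self-confidence in manual control, and lets a time constraint block the intention from becoming reliance behavior. All variable names, weights, and thresholds are hypothetical.

```python
# Illustrative sketch only: a toy rendering of the beliefs -> attitude (trust) ->
# intention -> behavior chain discussed above. Names, weights, and thresholds
# are hypothetical and are not taken from Lee and See (2004).
from dataclasses import dataclass


@dataclass
class Beliefs:
    """Information base about the automation (e.g., its observed reliability)."""
    observed_reliability: float  # 0.0-1.0, built up from experience


def trust_attitude(beliefs: Beliefs) -> float:
    """Affective evaluation of beliefs; here simply proportional to reliability."""
    return beliefs.observed_reliability


def intention_to_rely(trust: float, self_confidence: float) -> bool:
    """Intention forms when trust in the automation exceeds self-confidence
    in manual control (one factor among several, for illustration only)."""
    return trust > self_confidence


def reliance_behavior(intends_to_rely: bool, time_to_engage: float,
                      time_available: float) -> bool:
    """Environmental constraints (e.g., time to engage the automation) can keep
    an intention from becoming behavior, so reliance is not set by trust alone."""
    return intends_to_rely and time_to_engage <= time_available


if __name__ == "__main__":
    beliefs = Beliefs(observed_reliability=0.9)
    trust = trust_attitude(beliefs)
    intends = intention_to_rely(trust, self_confidence=0.7)
    relies = reliance_behavior(intends, time_to_engage=2.0, time_available=5.0)
    print(f"trust={trust:.2f}, intends={intends}, relies={relies}")
```

The point of the sketch is only the conceptual separation: beliefs feed the attitude of trust, trust shapes but does not equal the intention, and further constraints stand between intention and the observed reliance behavior.
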
Many definitions of trust also indicate the importance of the goal-oriented nature of trust. Although many definitions do not identify this aspect of trust explicitly, several mention the ability of the trustee to perform an important action and the expectation of the trustor that the trustee will perform as expected or can be relied upon (Gurtman, 1992; Johns, 1996; Mayer et al., 1995). These definitions describe the basis of trust in terms of the performance of an agent, the trustee, who furthers the goals of an individual, the trustor. In this way trust describes a relationship that depends on the characteristics of the trustee, the trustor, and the goal-related context of the interaction. Trust is not a consideration in situations where the trustor does not depend on the trustee to perform some function related to the trustor’s goals.

A simple definition of trust consistent with these considerations is the attitude that an agent will help achieve an individual’s goals in a situation characterized by uncertainty and vulnerability. This basic definition must be elaborated to consider the appropriateness of trust, the influence of context, the goal-related characteristics of the agent, and the cognitive processes that govern the development and erosion of trust. Figure 1 shows how these factors interact in a dynamic process of reliance, and the following sections elaborate on various components of this conceptual model.

First, we will consider the appropriateness of trust. In Figure 1, appropriateness is shown

Figure 1. The interaction of context, agent characteristics, and cognitive properties with the appropriateness of trust.

References (selected)

Ajzen, I., & Fishbein, M. (1980). Understanding attitudes and predicting social behavior. Englewood Cliffs, NJ: Prentice-Hall.

Bandura, A. (1982). Self-efficacy mechanism in human agency. American Psychologist, 37, 122–147.

Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. Cambridge, UK: Cambridge University Press.

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734.

Morgan, R. M., & Hunt, S. D. (1994). The commitment-trust theory of relationship marketing. Journal of Marketing, 58, 20–38.