2021 Jan 26;7:591448.
doi: 10.3389/frobt.2020.591448. eCollection 2020.

Development and Testing of Psychological Conflict Resolution Strategies for Assertive Robots to Resolve Human-Robot Goal Conflict


Franziska Babel et al. Front Robot AI. 2021.

Abstract

As service robots become increasingly autonomous and pursue their own task-related goals, human-robot conflicts seem inevitable, especially in shared spaces. Goal conflicts can arise from situations as simple as trajectory planning and as complex as task prioritization. For successful human-robot goal-conflict resolution, humans and robots need to negotiate their goals and priorities. To this end, the robot should be equipped with conflict resolution strategies that are assertive and effective yet still accepted by the user. In this paper, conflict resolution strategies for service robots (a public cleaning robot and a home assistant robot) are developed by transferring psychological concepts (e.g., negotiation, cooperation) to HRI. Altogether, fifteen strategies were grouped by the expected affective outcome (positive, neutral, negative). In two online experiments, the acceptability of and compliance with these conflict resolution strategies were tested with humanoid and mechanoid robots in two application contexts (public: n₁ = 61; private: n₂ = 93). As a comparative baseline, the strategies were also applied by a human. As additional outcomes, trust, fear, arousal, valence, and perceived politeness of the agent were assessed. The positive/neutral strategies were found to be more acceptable and effective than the negative strategies. Some negative strategies (i.e., threat, command) even led to reactance and fear. Some strategies were positively evaluated and effective only for certain agents (human or robot), or acceptable in only one of the two application contexts (i.e., approach, empathy). Influences on strategy acceptance and compliance in the public context were identified: acceptance was predicted by politeness and trust, whereas compliance was predicted by interpersonal power. Taken together, psychological conflict resolution strategies can be applied in HRI to enhance robot task effectiveness; if applied robot-specifically and context-sensitively, they are accepted by the user. The contribution of this paper is twofold: conflict resolution strategies based on Human Factors and Social Psychology are introduced and then empirically evaluated in two online studies across two application contexts. Influencing factors and requirements for the acceptance and effectiveness of robot assertiveness are discussed.
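The abstract's predictive claims (acceptance predicted by politeness and trust) suggest a regression-style analysis. The following is a minimal illustrative sketch only; the variable names, simulated 5-point ratings, and ordinary least squares model are assumptions made for illustration, not the authors' actual analysis pipeline.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(seed=42)
n = 61  # sample size of the public-context study (n₁ = 61)

# Hypothetical per-participant ratings on 5-point scales (assumed, not the study data).
politeness = rng.uniform(1, 5, n)
trust = rng.uniform(1, 5, n)
acceptance = 0.5 * politeness + 0.3 * trust + rng.normal(0, 0.5, n)

# OLS regression of strategy acceptance on politeness and trust ratings.
X = sm.add_constant(np.column_stack([politeness, trust]))
fit = sm.OLS(acceptance, X).fit()
print(fit.summary())  # coefficient estimates show each predictor's contribution

With real questionnaire data, significant positive coefficients for both predictors would correspond to the reported finding that politeness and trust predict acceptance.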

Keywords: HRI strategies; acceptance; persuasive robots; robot assertiveness; trust; user compliance.


Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

FIGURE 1
Schematic presentation of the participant's decision page in the questionnaire.
FIGURE 2
Screenshots from the robot videos. Each video lasted about 10 s and depicted the entity driving/walking toward the viewer in a neutral hallway. Robots and agent shown in Study 1 (A)–(D) and in Study 2 (C)–(E). Stimulus videos can be found in the Supplementary Material.
FIGURE 3
Robot ratings in the public context (top) and private context (bottom).
FIGURE 4
Compliance categories per use context: public context (top) and private context (bottom).
FIGURE 5
Acceptance ratings per strategy and use context. Error bars indicate ±2 standard errors of the mean.
