JMIR Hum Factors. 2022 Aug 5;9(3):e34514. doi: 10.2196/34514

Consumer Perspectives on the Use of Artificial Intelligence Technology and Automation in Crisis Support Services: Mixed Methods Study


Jennifer S Ma et al.

Abstract

Background: Emerging technologies, such as artificial intelligence (AI), have the potential to enhance service responsiveness and quality, improve reach to underserved groups, and help address the lack of workforce capacity in health and mental health care. However, little research has been conducted on the acceptability of AI, particularly in mental health and crisis support, and how this may inform the development of responsible and responsive innovation in the area.

Objective: This study aims to explore the level of support for the use of technology and automation, such as AI, in Lifeline's crisis support services in Australia; the likelihood of service use if technology and automation were implemented; the impact of demographic characteristics on the level of support and likelihood of service use; and reasons for not using Lifeline's crisis support services if technology and automation were implemented in the future.

Methods: A mixed methods study involving a computer-assisted telephone interview and a web-based survey was undertaken from 2019 to 2020 to explore expectations and anticipated outcomes of Lifeline's crisis support services in a nationally representative community sample (n=1300) and a Lifeline help-seeker sample (n=553). Participants were aged between 18 and 93 years. Quantitative descriptive analysis, binary logistic regression models, and qualitative thematic analysis were conducted to address the research objectives.
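
As a point of reference for the modeling approach named above, the sketch below shows how a binary logistic regression with odds ratios and 99% CIs (the interval width reported in the Results) can be fit in Python with statsmodels. The data are simulated and the variable names (age_group, less_likely_use) are hypothetical; nothing here is drawn from the study's actual dataset or code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated survey extract: one row per respondent, binary outcome indicating
# whether they said they would be less likely to use the service. Older
# respondents are given a modestly higher outcome probability for illustration.
rng = np.random.default_rng(0)
n = 500
age_group = rng.choice(["18-34", "35-54", "55+"], size=n)
p = np.where(age_group == "55+", 0.6, np.where(age_group == "35-54", 0.5, 0.4))
less_likely_use = rng.binomial(1, p)
df = pd.DataFrame({"age_group": age_group, "less_likely_use": less_likely_use})

# Binary logistic regression with the youngest age band as the reference level.
model = smf.logit(
    "less_likely_use ~ C(age_group, Treatment(reference='18-34'))", data=df
).fit(disp=False)

# Exponentiated coefficients give odds ratios; conf_int(alpha=0.01) gives 99% CIs.
odds_ratios = np.exp(model.params).rename("OR")
ci_99 = np.exp(model.conf_int(alpha=0.01)).rename(columns={0: "0.5%", 1: "99.5%"})
print(pd.concat([odds_ratios, ci_99], axis=1))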

Results: One-third of the community and help-seeker participants did not support the collection of information about service users through technology and automation (ie, via AI), and approximately half reported that they would be less likely to use the service if automation were introduced. Significant demographic differences were observed between the community and help-seeker samples. Of the demographic characteristics, only older age predicted being less likely to endorse technology and automation to tailor Lifeline's crisis support service and less likely to use such a service (odds ratio 1.48-1.66, 99% CI 1.03-2.38; P<.001 to P=.005). The most common reason for reluctance, reported in both samples, was that respondents wanted to speak to a real person, assuming that human counselors would be replaced by automated robots or machine services.

Conclusions: Although Lifeline plans to always have a real person providing crisis support, help-seekers automatically fear this will not be the case if new technology and automation such as AI are introduced. Consequently, incorporating innovative use of technology to improve help-seeker outcomes in such services will require careful messaging and assurance that the human connection will continue.

Keywords: acceptability; artificial intelligence; community; consumer; crisis; help-seeker; perspective; support; technology.


Conflict of interest statement

Conflicts of Interest: None declared.

Figures

Figure 1. Levels of community (n=1268) and help-seeker (n=426) participants’ support for the use of technology and automation to tailor Lifeline’s crisis support service.

Figure 2. Likelihood of community (n=1247) and help-seeker (n=426) participants using Lifeline if technology and automation were used.

Figure 3. Reasons for community (n=595) and help-seeker (n=200) participants not using the Lifeline crisis support service if technology and automation were used (open-ended responses).
