Review

Toward an AI Era: Advances in Electronic Skins

Xuemei Fu et al.
Chem Rev. 2024 Sep 11;124(17):9899-9948. doi: 10.1021/acs.chemrev.4c00049. Epub 2024 Aug 28.

Abstract

Electronic skins (e-skins) have seen intense research and rapid development in the past two decades. To mimic the capabilities of human skin, a multitude of flexible/stretchable sensors that detect physiological and environmental signals have been designed and integrated into functional systems. Recently, researchers have increasingly deployed machine learning and other artificial intelligence (AI) technologies to mimic the human neural system for the processing and analysis of sensory data collected by e-skins. Integrating AI has the potential to enable advanced applications in robotics, healthcare, and human-machine interfaces but also presents challenges such as data diversity and AI model robustness. In this review, we first summarize the functions and features of e-skins, followed by feature extraction of sensory data and different AI models. Next, we discuss the utilization of AI in the design of e-skin sensors and address the key topic of AI implementation in data processing and analysis of e-skins to accomplish a range of different tasks. Subsequently, we explore hardware-layer in-skin intelligence before concluding with an analysis of the challenges and opportunities in the various aspects of AI-enabled e-skins.


Conflict of interest statement

The authors declare no competing financial interest.

Figures

Figure 1
Potential of AI in enabling e-skin designs spanning materials, sensors, sensing performance, and functionality (e.g., self-healing, biodegradability). AI can also enable e-skin applications, such as surgical operations performed by medical robots, through the analysis of sensory data and the output of feedback information after decision-making.
Figure 2
Evolution of e-skins from prototypical to functional to intelligent. ML, machine learning; DL, deep learning; CNN, convolutional neural network; and ANN, artificial neural network. Images reproduced with permissions: “Artificial touch in hand-prosthesis.” Copyright 1967 Springer Nature. “Sensitive skin: infrared sensor array on robot arm.” Copyright 2005 John Wiley and Sons. “Flexible active-matrix e-skin.” Copyright (2004) National Academy of Sciences, U.S.A. “Microstructured pressure sensor.” Copyright 2010 Springer Nature. “Epidermal e-skin.” Copyright 2011 The American Association for the Advancement of Science. “Stretchable transparent e-skin.” Copyright 2011 Springer Nature. “Strain sensor for sound recognition using ANN.” Copyright 2015 Springer Nature. “Multiplexed wearable perspiration analysis.” Copyright 2016 Springer Nature. “Artificial fingertip for roughness discrimination using ML.” Copyright 2017 Elsevier. “Self-healable e-skin system.” Copyright 2018 Springer Nature. “Ultrafast, asynchronous multimodal tactile encoding.” Copyright 2019 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. “Tactile glove for object recognition using CNN.” Copyright 2019 Springer Nature. “Multimodal e-skin for garbage sorting using ML.” Copyright 2020 The American Association for the Advancement of Science. “Flexible chip with embedded ML.” Copyright 2020 Springer Nature. “AR/VR haptic glove enabled by AI.” From ref (39). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS. “Strain sensor for sign-to-speech translation using ML.” Copyright 2020 Springer Nature. “Acoustic biometric authentication using ML.” From ref (45). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS. “Triboelectric e-skin for pulse pressure prediction using DL.” Copyright 2021 John Wiley and Sons. “E-skin for hand task recognition using meta-learning.” Copyright 2022 Springer Nature. “Triboelectricity for materials identification using ML.” From ref (52). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS. “Synaptic transistor for in-skin learning.” Copyright 2022 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. “Stretchable e-skin for deformation reconstruction using AI.” Copyright 2023 Springer Nature. “Ultrathin memristor-based 3D space writing recognition.” Copyright 2023 Springer Nature.
Figure 3
Overview of the sensors, outputs, and desired attributes of e-skins. Sensors: Image representing tactile was reproduced with permission from ref (72). Copyright 2020 Springer Nature under a CC BY 4.0 license. Image representing temperature was reproduced with permission from ref (73). Copyright 2022 Royal Society of Chemistry. Image representing chemical from ref (60). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS. Image representing electrophysiological was reproduced with permission from ref (74). Copyright 2020 Springer Nature under CC BY 4.0 license. Image representing optical sensors was reproduced with permission from ref (75). Copyright 2021 John Wiley and Sons. Outputs: Image representing thermoregulation was reproduced with permission from ref (76). Copyright 2022 Springer Nature under CC BY 4.0 license. Image representing visual displays from ref (77). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS. Image representing haptics was reproduced with permission from ref (78). Copyright 2022 Springer Nature. Features: Image representing multimodality was reproduced with permission from ref (79). Copyright 2018 Springer Nature under a CC BY 4.0 license. Image representing self-healing was reproduced with permission from ref (80). Copyright 2022 John Wiley and Sons. Image representing imperceptibility was reproduced with permission from ref (81). Copyright 2023 Elsevier. Image representing wireless communications was reproduced with permission from ref (82). Copyright 2022 American Association for the Advancement of Science. Applications: Image representing e-skins for prosthetics was reproduced with permission from ref (78). Copyright 2022 Springer Nature. Image representing wearables was reproduced with permission from ref (61). Copyright 2023 American Association for the Advancement of Science. Image representing robotics was reproduced with permission from ref (83). Copyright 2022 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science.
Figure 4
Tactile sensing. (a) Illustration of the structure of PIEZO2 proteins found in mechanosensitive ion channels in human cells. Image was reproduced with permission from ref (88). Copyright 2019 Springer Nature. (b) Schematic showing PIEZO2 ion channels expressed in Merkel cells. Image was reproduced with permission from ref (1). Copyright 2021 Springer Nature. (c) Schematic of a capacitive pressure sensor with porous microstructured dielectric. Image was reproduced with permission from ref (110). Copyright 2019 American Chemical Society. (d) Schematic showing reconnection of conductive paths for a piezoresistive sensor under compression. Image was reproduced with permission from ref (40). Copyright 2020. Published under the PNAS license. (e) Schematic of a piezoelectric sensor. Image was reproduced with permission from ref (114). Copyright 2022 John Wiley and Sons. (f) Schematic showing the operation of a triboelectric sensor in single electrode mode. Image was reproduced with permission from ref (102). Copyright 2022 John Wiley and Sons under CC BY 4.0 license.
Figure 5
Sensing mechanisms of other stimuli in e-skins. (a) Temperature: schematic showing increased ionic conductivity of an ionic conductor with temperature. Image was reproduced with permission from ref (125). Copyright 2020 American Chemical Society. (b) Chemical: schematic of a glucose sensing patch based on reduction of H2O2 at the working electrode: (i) working electrode, (ii) counter electrode, (iii) reference electrode, (iv) iontophoretic anode, and (v) current collector. Image was reproduced with permission from ref (136). Copyright 2022 Elsevier. (c) Electrophysiological: circuit design for different channels and functions of electrically compensated electrodes. From ref (142). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS. (d) Optical: schematic of an ultrathin polymer LED and organic photodetector used for photoplethysmography. Image was reproduced with permission from ref (147). Copyright 2021 Springer Nature under CC BY 4.0 license.
Figure 6
E-skin outputs. (a) Thermoregulation: exploded view of a thermally controlled epidermal VR system. Image was reproduced with permission from ref (150). Copyright 2023 the Author(s). Published by PNAS under CC BY-NC-ND 4.0 license. (b) Haptics: photograph of a multimodal haptic glove. Image was reproduced with permission from ref (162). Copyright 2020 John Wiley and Sons under CC BY 4.0 license. (c) Displays: photograph of skin-like healthcare patch on human hand with conformal contact and schematic layout of LED pixel. From ref (77). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS.
Figure 7
E-skin features. (a) Multimodal: Schematic of the sensor for multimode tactile and temperature sensing. Image was reproduced with permission from ref (195). Copyright 2020 The American Association for the Advancement of Science. (b) Self-healing: Series of optical images showing a pristine capacitor, damaged by a cut through all of the layers, and healed layers. Image was reproduced with permission from ref (203). Copyright 2023 The American Association for the Advancement of Science. (c) Imperceptible: Photographs and schematic of a resistive tactile sensor and transistor on a fingertip. Image was reproduced with permission from ref (209). Copyright 2022 John Wiley and Sons. (d) Wireless: Optical photograph of an antenna formed by copper traces with circuit components encircled within. From ref (211). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS.
Figure 8
Workflow of AI implementation in e-skin applications.
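As a concrete illustration of such a workflow (a minimal sketch, not taken from the review, assuming scikit-learn and synthetic stand-in data for the e-skin recordings), the example below windows a raw multichannel signal, extracts simple statistical features, and trains a classifier for inference:

```python
# Illustrative sketch (not from the paper): a generic e-skin AI workflow of
# windowing raw sensor streams, extracting simple features, and training a
# classifier. Synthetic data stands in for real e-skin recordings.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def extract_features(window):
    """Hand-crafted features per sensor channel: mean, std, peak-to-peak."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           window.max(axis=0) - window.min(axis=0)])

# Synthetic dataset: 300 windows of 100 time steps x 8 sensor channels,
# belonging to 3 hypothetical touch gestures.
X_raw = rng.normal(size=(300, 100, 8))
y = rng.integers(0, 3, size=300)
X_raw += y[:, None, None] * 0.5          # inject class-dependent offsets

X = np.array([extract_features(w) for w in X_raw])
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```

In practice the feature extractor and classifier would be chosen per task; the pipeline structure (windowing, features, scaling, model, inference) is the part this sketch is meant to convey.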
Figure 9
Summary of neural networks applied in recently reported AI-enabled e-skins, alongside the human senses (taste, hearing, touch, smell, and sight), the human brain, and outstanding models. As data from e-skins grew and multimodal interactions became more intricate, the limitations of MLPs became evident. CNNs emerged as powerful tools for automated analysis. Attention-based models later demonstrated their versatility in capturing contextual relationships in sequential data, enabling more complex tasks for e-skin applications. However, there is still a noticeable gap between present AI-enabled e-skins and humans in both sensing and cognition. Images were reproduced with permission from ref (49), Copyright 2022 Springer Nature; ref (56), Copyright 2023 Springer Nature; ref (36), Copyright 2019 Springer Nature; and ref (267), Copyright 2023 Springer Nature.
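To make the MLP-to-CNN progression concrete, the following is a minimal sketch (assuming PyTorch, 16 x 16 taxel frames, and four hypothetical touch classes; none of these details come from the cited works) of a small convolutional classifier that exploits the spatial layout of a pressure-sensor array:

```python
# Illustrative sketch (assumptions: PyTorch available, 16x16 taxel frames,
# 4 touch classes). A small CNN that exploits the spatial layout of a
# pressure-sensor array, the kind of model the caption contrasts with MLPs.
import torch
import torch.nn as nn

class TactileCNN(nn.Module):
    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 16x16 -> 8x8
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 8x8 -> 4x4
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):                          # x: (batch, 1, 16, 16)
        return self.classifier(self.features(x).flatten(1))

# One optimization step on random stand-in frames.
model = TactileCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
frames = torch.randn(8, 1, 16, 16)                 # synthetic pressure maps
labels = torch.randint(0, 4, (8,))
loss = nn.CrossEntropyLoss()(model(frames), labels)
loss.backward()
optimizer.step()
print("loss:", loss.item())
```

Unlike an MLP on flattened inputs, the convolutional layers share weights across the array, which is why CNNs scale better as taxel counts grow.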
Figure 10
AI-driven materials discovery. (a) Optimization of flexible Ag/PAA composite films for different applications (i.e., electrodes and resistive sensing materials). Image was reproduced with permission from ref. Copyright 2020 Royal Society of Chemistry. (b) Discovery and optimization of BaTiO3 compounds for large electrostrain via active learning in the material design space with around 605,000 compositions. Image was reproduced with permission from ref (336). Copyright 2018 John Wiley and Sons. (c) A microstructure design space of BaTiO3 nanofiller in the BaTiO3/PVA piezoelectric composite with 400 microstructures and schematics of 10 representative microstructures. This was for the theoretical analysis and optimization of nanofillers in the composites. Image was reproduced with permission from ref (337). Copyright 2022 John Wiley and Sons under CC BY-NC 4.0 license. (d) AI model combining dynamic EFM model and static CNN model to predict and understand the toughness evolution of an intrinsic self-healing polymer over time with an initial single-cut image as the input. Image was reproduced with permission from ref (340). Copyright 2022 American Chemical Society.
Figure 11
AI-driven sensor design. (a) Sensor design using virtual data for training. (i) Electrode layout of a simulated capacitive e-skin sensor array with 64 electrodes. (ii) The reconstruction performance of the AI model under four electrode layouts. (iii) The schematic of a physical capacitive e-skin sensor array with 32 electrodes. (iv) Deformation reconstruction process from data collection to point cloud representation. Image was reproduced with permission from ref (56). Copyright 2023 Springer Nature. (b) Sensor design using real data from experiments for training. (i) Schematics and resistance–strain profiles of strain sensors with different film composition (left) and film microstructure (right). (ii) Schematic of a navigation model to progressively explore a strain sensor design space through 12 active learning loops in which 125 strain sensors were fabricated stepwise to provide training data. Image was reproduced with permission from ref (53). Copyright 2022 Springer Nature.
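The navigation-model idea in panel (b) can be illustrated by a generic active-learning loop. The sketch below is not the authors' pipeline: it assumes scikit-learn, fabricate_and_measure is a hypothetical stand-in for the real fabrication and characterization step, and the two normalized design parameters are arbitrary. It uses a Gaussian-process surrogate with an upper-confidence-bound acquisition over a candidate grid:

```python
# Illustrative sketch (not the authors' pipeline): an active-learning loop in
# which a Gaussian-process surrogate proposes the next sensor design to
# fabricate. `fabricate_and_measure` is a hypothetical stand-in for the
# experiment; the design space is a toy 2D grid (e.g., filler fraction
# and film thickness, both normalized).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)

def fabricate_and_measure(x):
    """Stand-in for a real experiment returning, e.g., a gauge-factor score."""
    return -((x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2) + rng.normal(0, 0.01)

# Candidate designs on a normalized grid.
grid = np.array([[a, b] for a in np.linspace(0, 1, 25)
                        for b in np.linspace(0, 1, 25)])

X = grid[rng.choice(len(grid), 5, replace=False)]    # initial designs
y = np.array([fabricate_and_measure(x) for x in X])

for loop in range(12):                               # 12 learning loops
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    mean, std = gp.predict(grid, return_std=True)
    ucb = mean + 1.96 * std                          # acquisition: upper bound
    x_next = grid[np.argmax(ucb)]
    X = np.vstack([X, x_next])
    y = np.append(y, fabricate_and_measure(x_next))

print("best measured design:", X[np.argmax(y)], "score:", y.max())
```

The key design choice in such loops is the acquisition function, which trades off exploring uncertain regions of the design space against exploiting designs the surrogate already predicts to perform well.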
Figure 12
AI-empowered pressure-sensitive e-skin applications. (a) High-accuracy texture recognition using a CNN through a single touch with a crack-based, high-density, low-hysteresis pressure sensor array. Image was reproduced with permission from ref (40). Copyright 2020. Published under the PNAS license. (b) Intelligent disease diagnostic system based on multidimensional feature extraction of pulse pressure signals using a random forest classifier. Image was reproduced with permission from ref (59). Copyright 2023 The Authors. Published by American Chemical Society. (c) Object recognition using a CNN with a glove equipped with 548 pressure sensors covering the entire hand. Image was reproduced with permission from ref (36). Copyright 2019 Springer Nature. (d) Super-resolution tactile e-skin assisted by an MLP. Image was reproduced with permission from ref (50). Copyright 2022 The American Association for the Advancement of Science.
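As an illustration of the feature-based classification in panel (b), the sketch below assumes scikit-learn, synthetic pulse-like waveforms, and hand-picked descriptors; it is not the authors' pipeline. It extracts a few pulse features and trains a random forest, then inspects feature importances:

```python
# Illustrative sketch (assumed features, synthetic waveforms): hand-crafted
# pulse-pressure descriptors fed to a random forest, loosely mirroring the
# multidimensional feature extraction described for panel (b).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

def pulse_features(wave):
    """Peak amplitude, normalized peak position, rise slope, waveform area."""
    peak = int(wave.argmax())
    rise = (wave[peak] - wave[0]) / max(peak, 1)
    return [wave.max(), peak / len(wave), rise, wave.sum()]

# Two hypothetical classes of pulse waveforms with different peak timing.
t = np.linspace(0, 1, 200)
waves, labels = [], []
for label, center in [(0, 0.30), (1, 0.45)]:
    for _ in range(100):
        c = center + rng.normal(0, 0.02)
        waves.append(np.exp(-((t - c) ** 2) / 0.01) + rng.normal(0, 0.05, t.size))
        labels.append(label)

X = np.array([pulse_features(w) for w in waves])
y = np.array(labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cv accuracy:", cross_val_score(clf, X, y, cv=5).mean())
print("feature importances:", clf.fit(X, y).feature_importances_)
```

Feature importances are one reason tree ensembles are popular for physiological signals: they indicate which waveform descriptors drive the prediction.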
Figure 13
AI-empowered strain-sensitive e-skin applications. (a) Sign-to-speech translation through finger-attached strain sensors and an SVM. Image was reproduced with permission from ref (42). Copyright 2020 Springer Nature. (b) Unsupervised rapid hand task recognition through a substrate-less nanomesh strain sensor on the fingers and TD-C learning. Image was reproduced with permission from ref (49). Copyright 2022 Springer Nature. (c) Classification and recognition of throat activities through a hierarchically resistive strain sensor attached to the throat and MLP and CNN algorithms. Image was reproduced with permission from ref (58). Copyright 2023 Springer Nature.
Figure 14
AI-empowered thermosensitive e-skin applications. (a) Core body temperature quantification through wearable temperature sensors and regression models. Image was reproduced with permission from ref (381). Copyright 2023 John Wiley and Sons under CC BY-NC 4.0 license. (b) Object recognition through thermal-based pressure sensing and thermal conductivity sensing combined with an MLP. Image was reproduced with permission from ref (37). Copyright 2020 The American Association for the Advancement of Science.
Figure 15
AI-empowered nanogenerator-based e-skin applications. (a) Mobile acoustic sensing using piezoelectric sensors for voice recognition and biometric authentication through ML. From ref (45). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS. (b) Human motion prediction based on triboelectric sensor and PCA-assisted ML. Image was reproduced with permission from ref (214). Copyright 2023 Springer Nature under CC BY 4.0 license. (c) Blood pressure estimation based on a textile triboelectric sensor and a specially developed regression model. Image was reproduced with permission from ref (48). Copyright 2021 John Wiley and Sons. (d) Material identification based on triboelectric sensing and LDA. From ref (52). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS.
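The PCA-assisted ML and LDA approaches in panels (b) and (d) can be sketched generically as a dimensionality-reduction-plus-linear-discriminant pipeline. The example below is illustrative only, assuming scikit-learn and synthetic triboelectric-like waveforms for three hypothetical contact materials:

```python
# Illustrative sketch (synthetic signals, hypothetical material classes):
# PCA compresses raw triboelectric waveforms before a linear discriminant
# classifier, echoing the PCA-assisted ML and LDA approaches in panels (b, d).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 500)

signals, labels = [], []
for label, (amp, decay) in enumerate([(1.0, 5.0), (0.7, 8.0), (1.3, 3.0)]):
    for _ in range(60):  # 3 hypothetical contact materials, 60 traces each
        trace = amp * np.exp(-decay * t) * np.sin(40 * t)
        signals.append(trace + rng.normal(0, 0.05, t.size))
        labels.append(label)

X, y = np.array(signals), np.array(labels)

model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
print("cv accuracy:", cross_val_score(model, X, y, cv=5).mean())
```

Compressing the waveform with PCA before the discriminant keeps the classifier small and fast, which matters for wearable or on-device inference.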
Figure 16
AI-empowered multimodal e-skin applications. (a) Detection, classification, and discrimination of multimodal stimuli (strain, flexion, pressure, and temperature) based on a stretchable cross-reactive sensor matrix and a BoW model. Image was reproduced with permission from ref (43). Copyright 2020 John Wiley and Sons. (b) Wound monitoring enabled by a multiplexed sensor patch and a CNN. From ref (60). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS. (c) Virtual shop applications enabled by multimodal sensing and multiple ML models. Image was reproduced with permission from ref (417). Copyright 2021 John Wiley and Sons under CC BY 4.0 license.
Figure 17
In-skin intelligent e-skin systems. (a) A neuroinspired event-based asynchronous encoding system for e-skin signal transmission. Image was reproduced with permission from ref (35). Copyright 2019 American Association for the Advancement of Science. (b) Low-voltage-driven artificial soft e-skin system enabling biomimetic bidirectional signal transmission. Image was reproduced with permission from ref (61). Copyright 2023 American Association for the Advancement of Science. (c) In-sensor tactile position encoding using spikes tuned by the ion–electron relaxation effect. Image was reproduced with permission from ref (54). Copyright 2022 The American Association for the Advancement of Science.
Figure 18
Hardware-layer neural networks based on artificial synaptic devices for in-skin data processing and analysis. (a) Integrating flexible synaptic transistor array (i) with stretchable resistive sensors for simulated sign language translation (ii). From ref (247). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS. (b) Integrating ultrathin synaptic memristor array with two ultrathin organic photodiodes for simulated finger writing recognition. (i) Synaptic behaviors of the memristor on a hand-like replica. (ii) The photodiodes attached on an index finger. The three-dimensional curve of time-resolved output voltage changes during finger motion was transformed to a 2-dimensional image for digit 3. (iii) Simulated finger writing recognition process using the 2-dimensional image and the memristor array. Image was reproduced with permission from ref (57). Copyright 2023 Springer Nature.
Figure 19
In-skin adaptivity for robots enabled by artificial synaptic devices. (a) Integrating stretchable synaptic transistors with stretchable triboelectric sensors for programmable control of a pneumatic soft robot for adaptive locomotion. From ref (443). Copyright The Authors, some rights reserved; exclusive licensee AAAS. Distributed under a CC BY-NC 4.0 license http://creativecommons.org/licenses/by-nc/4.0/. Reprinted with permission from AAAS. (b) In-skin teaching and learning of a robotic hand to acquire a pain reflex based on an artificial tactile neural pathway. (i) Illustration of the artificial tactile neural pathway. (ii) Input spiking signals (left) and postsynaptic currents (right) of the synaptic transistor before and after involving a teacher signal. (iii) The adaptive motion (withdrawal for self-protection) after teaching and learning. Image was reproduced with permission from ref (55). Copyright 2022 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science.
Figure 20
Future directions of AI-enabled e-skins. The number of articles on AI-enabled e-skins has surged by over 260% since 2020, with more than 5,000 added in under five years.

References

    1. Handler A.; Ginty D. D. The Mechanosensory Neurons of Touch and Their Mechanisms of Activation. Nat. Rev. Neurosci. 2021, 22, 521–537. DOI: 10.1038/s41583-021-00489-x.
    2. Zimmerman A.; Bai L.; Ginty D. D. The Gentle Touch Receptors of Mammalian Skin. Science 2014, 346, 950–954. DOI: 10.1126/science.1254229.
    3. McGlone F.; Reilly D. The Cutaneous Sensory System. Neurosci. Biobehav. Rev. 2010, 34, 148–159. DOI: 10.1016/j.neubiorev.2009.08.004.
    4. Erzurumlu R. S.; Murakami Y.; Rijli F. M. Mapping the Face in the Somatosensory Brainstem. Nat. Rev. Neurosci. 2010, 11, 252–263. DOI: 10.1038/nrn2804.
    5. Johansson R. S.; Flanagan J. R. Coding and Use of Tactile Signals from the Fingertips in Object Manipulation Tasks. Nat. Rev. Neurosci. 2009, 10, 345–359. DOI: 10.1038/nrn2621.
