Sci Robot. 2022 Jun;7(67):eabn0495.
doi: 10.1126/scirobotics.abn0495. Epub 2022 Jun 1.

All-printed soft human-machine interface for robotic physicochemical sensing

You Yu et al. Sci Robot. 2022 Jun.

Abstract

Ultrasensitive multimodal physicochemical sensing for autonomous robotic decision-making has numerous applications in agriculture, security, environmental protection, and public health. Previously reported robotic sensing technologies have primarily focused on monitoring physical parameters such as pressure and temperature. Integrating chemical sensors for autonomous dry-phase analyte detection on a robotic platform remains extremely challenging and substantially underdeveloped. Here, we introduce an artificial intelligence-powered multimodal robotic sensing system (M-Bot) with an all-printed, mass-producible soft electronic skin-based human-machine interface. A scalable inkjet printing technology with custom-developed nanomaterial inks was used to manufacture flexible physicochemical sensor arrays for electrophysiology recording, tactile perception, and robotic sensing of a wide range of hazardous materials including nitroaromatic explosives, pesticides, nerve agents, and infectious pathogens such as SARS-CoV-2. The M-Bot decodes the surface electromyography signals collected from the human body through machine learning algorithms for remote robotic control and can perform in situ threat compound detection in extreme or contaminated environments with user-interactive tactile and threat alarm feedback. The printed electronic skin-based robotic sensing technology can be further generalized and applied to other remote sensing platforms. This versatility was validated on an intelligent multimodal robotic boat platform that can efficiently track the source of trace amounts of hazardous compounds through autonomous and intelligent decision-making algorithms. This fully printed human-machine interactive multimodal sensing technology could play a crucial role in designing future intelligent robotic systems and can be easily reconfigured toward numerous practical wearable and robotic applications.


Conflict of interest statement

Competing interests: The authors declare that they have no competing interests.

Figures

Fig. 1. Artificial intelligence (AI)-powered multimodal sensing robotic system (M-Bot) based on a fully-printed soft human-machine interface.
(A) Schematic of the M-Bot that contains a pair of fully-printed soft electronic skins (e-skins): e-skin-H (interfacing with the human skin) and e-skin-R (interfacing with the robotic skin) for AI-powered robotic control and multimodal physicochemical sensing with user-interactive feedback. LPS, laser proximity sensor; sEMG, surface electromyography; T, temperature; KNN, K-nearest neighbors algorithm. (B and C) Photographs of the robotic skin-interfaced e-skin-R consisting of arrays of printed multimodal sensors. Scale bars, 3 cm. (D) Schematic illustration of rapid, scalable, and cost-effective prototyping of the kirigami soft e-skin-R using inkjet printing and automatic cutting. PI, polyimide. (E) Photograph of the human skin–interfaced soft e-skin-H with arrays of sEMG and feedback stimulation electrodes. Scale bar, 1 cm. (F) Schematic signal-flow diagram of the M-Bot. In-Amp, instrumentation amplifier; HPF, high-pass filter; E, applied voltage; ES, electrical stimulation; SPU, signal processing unit. WE, CE, and RE represent working, counter, and reference electrodes of the printed chemical sensor, respectively.
Fig. 2. Characterization of the fully inkjet-printed multimodal sensor arrays on the e-skin-R.
(A) Photograph of a multimodal flexible sensor array printed with custom nanomaterial inks that consists of a temperature sensor, a tactile sensor, and an electrochemical sensor coated with a soft analyte-sampling hydrogel film. Scale bar, 5 mm. (B and C) Schematic (B) and scanning electron microscopy (SEM) image (C) of the printed AgNWs/N-PDMS tactile sensor. Scale bar, 1 μm. (D and E) Response of a tactile sensor under varied pressure loads (D) and repetitive pressure loading (E). (F and G) Schematic (F) and SEM (G) of the printed Pt-graphene electrode for TNT detection. Scale bar, 4 μm. (H) Cyclic voltammograms (CVs) of an IPCE and a printed Pt-graphene electrode in 0.5 M H2SO4 and in 5 mM K3Fe(CN)6 (inset). j, current density. (I) nDPV voltammograms and the calibration plots (inset) of TNT detection using a Pt-graphene electrode. (J) Dynamics of robotic fingertip detection of dry-phase TNT using a Pt-graphene sensor. (K and L) Schematic (K) and SEM image (L) of the printed MOF-808/Au electrode for OP detection. Scale bar, 100 nm. (M) CVs of an IPCE, a Au electrode, and a MOF-808/Au electrode in McIlvaine buffer and in 5 mM K3Fe(CN)6 (inset). (N) nDPV voltammograms of the OP detection. Inset, the calibration plots. (O) Robotic fingertip detection of dry-phase OP using a MOF-808/Au sensor. (P and Q) Schematic (P) and SEM image (Q) of the printed CNT electrode for SARS-CoV-2 detection. Scale bar, 250 nm. (R) DPV voltammograms of a printed CNT electrode in 5 mM K3Fe(CN)6 after each surface immobilization step. EDC, 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide; AbC, capture antibody; BSA, bovine serum albumin. (S) Calibration plots of the CNT-based sensor for S1 detection. Δj, percentage DPV peak current changes after target incubation. (T) Response of a CNT sensor in the presence and absence of dry-phase S1. All error bars represent the s.d. from 3 sensors.
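The calibration plots in (I), (N), and (S) relate voltammetric peak current to analyte concentration, which is then inverted to estimate an unknown concentration from a sensor reading. A minimal sketch of such a linear calibration, using invented data (the concentrations and peak currents below are illustrative, not values from the paper):

```python
import numpy as np

# Hypothetical calibration data: analyte concentration (uM) vs. nDPV
# peak current density (uA/cm^2). Values are invented for illustration.
conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
peak = np.array([0.02, 0.51, 1.01, 2.03, 3.98])

# Least-squares linear fit: peak ≈ slope * conc + intercept
slope, intercept = np.polyfit(conc, peak, 1)

def estimate_concentration(measured_peak):
    """Invert the calibration line to estimate concentration from a reading."""
    return (measured_peak - intercept) / slope

print(estimate_concentration(1.5))  # concentration implied by a 1.5 uA/cm^2 peak
```

In practice the sensor response may deviate from linearity at high concentrations, in which case a higher-order fit or a restricted linear range would be used.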
Fig. 3. Evaluation of the e-skin-H for AI-assisted human-machine interaction.
(A) Schematic of machine learning-enabled human gesture recognition and robotic control. (B and C) Schematic (B) and photograph (C) of a PDMS-encapsulated soft e-skin-H with sEMG and electrical stimulation electrodes for closed-loop human-interactive robotic control. Scale bar, 1 cm. (D) sEMG data collected by the four-channel e-skin-H from 6 human gestures. (E) Classification confusion matrix using a KNN model based on real-time experimental data. White text values, percentages of correct predictions; red text values, percentages of incorrect predictions. (F) A SHAP decision plot explaining how a KNN model arrives at each final classification for every datapoint using all 5 features. Each decision line tracks the features' contributions to every individual classification; each final classification is represented as a serialized integer (that maps to a hand movement). Dotted lines represent misclassified points. (G–I) Time-lapse images of the AI-assisted human-interactive robotic control using the M-Bot. Scale bars, 5 cm. (J) Response of the LPS when the robot approaches and leaves an object. (K) Current applied on a participant’s arm during the feedback stimulation.
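The gesture-recognition pipeline in (A)–(E) classifies feature vectors extracted from sEMG windows with a K-nearest neighbors (KNN) model. A minimal sketch of the core KNN vote, assuming precomputed features; the feature values, gesture labels, and 2-D feature choice below are invented for illustration, not the paper's actual feature set:

```python
import numpy as np
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training samples (Euclidean distance)."""
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Invented 2-D feature vectors (e.g. mean absolute value and waveform
# length of an sEMG window) for two gestures: 0 = fist, 1 = open hand.
train_X = np.array([[0.90, 0.80], [1.00, 0.90], [0.80, 1.00],
                    [0.10, 0.20], [0.20, 0.10], [0.15, 0.25]])
train_y = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(train_X, train_y, np.array([0.85, 0.90])))  # prints 0 (fist)
```

The real system uses four sEMG channels and five features per window; the confusion matrix in (E) and the SHAP plot in (F) then summarize how well, and why, such a model separates the six gestures.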
Fig. 4. Evaluation of the M-Bot in human-interactive robotic physicochemical sensing.
(A–D) Time-lapse images of the human-interactive robotic control for object grasping and on-site TNT detection. Scale bars, 5 cm. (E–G) The sEMG data collected in real time, which allow the robotic hand to approach and grasp a spherical object (E) and the corresponding tactile (F) and TNT (G) sensor responses. Insets in F and G, colored mapping of pressure and TNT distributions on the object. (H–J) Photograph of the robotic OP sensing on a cylindrical object (H) and the corresponding responses of the tactile sensors (I) and OP sensors (J) on an e-skin-R. Insets in I and J represent the color mappings of pressure and OP distributions on the object. Scale bar, 5 cm.
Fig. 5. Evaluation of the e-skin-R in an autonomous multimodal sensing robotic boat (M-Boat).
(A) Schematic of the intelligent M-Boat integrated with a printed multimodal e-skin-R sensor array that can perform temperature and multiplexed chemical sensing for autonomous source tracking. (B and C) Schematic (B) and photograph (C) illustrating the assembly of the M-Boat components. (D) System block diagram of the M-Boat for autonomous propulsion, sensing, and signal processing. ADC, analog-to-digital converter; AFE, analog front-end; BLE, Bluetooth low energy; CPU, central processing unit; GPIO, general-purpose input/output; H-SW, H-bridge switch; MUX, multiplexer; PWM, pulse width modulation; SPI, serial peripheral interface. (E) Wireless control of the M-Boat. (F) Simulated distributions of a hazardous chemical (OP) leak and the algorithm used by the M-Boat for autonomous source tracking. C, concentration. (G) Simulation comparison of different search algorithms used by the M-Boat for intelligent source tracking. GD, gradient descent; IM, interpolated map; MD, max direction; RD, random direction. (H and I) Time-lapse images showing example demonstrations of the M-Boat-enabled autonomous source tracking. Insets, simulated target (proton from corrosive acid) distributions. Scale bars, 5 cm. (J and K) Time-lapse images (J) and the nDPV voltammograms collected at location 1 (K) for autonomous decision-making and chemical threat (OP) source tracking using an M-Boat. Insets, simulated OP distribution. Scale bar, 5 cm.
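Panels (F) and (G) compare search strategies (gradient descent, interpolated map, max direction, random direction) for steering the M-Boat toward a chemical source. A minimal sketch of the gradient-climbing idea on a simulated concentration field; the field model, fixed step size, and finite-difference gradient estimate are assumptions for illustration, not the paper's algorithm:

```python
import numpy as np

def concentration(p, source=np.array([3.0, -2.0])):
    """Simulated concentration field that decays with distance from the source."""
    return 1.0 / (1.0 + np.sum((p - source) ** 2))

def track_source(start, step=0.5, iters=200, eps=1e-3):
    """Repeatedly move a fixed step in the direction of increasing
    concentration, estimated by central finite differences."""
    p = np.array(start, dtype=float)
    for _ in range(iters):
        grad = np.array([
            concentration(p + [eps, 0.0]) - concentration(p - [eps, 0.0]),
            concentration(p + [0.0, eps]) - concentration(p - [0.0, eps]),
        ]) / (2 * eps)
        norm = np.linalg.norm(grad)
        if norm < 1e-8:  # flat field: no direction to follow
            break
        p += step * grad / norm
    return p

print(track_source([0.0, 0.0]))  # ends near the source at (3, -2)
```

With a fixed step the tracker hovers within one step length of the source; a real platform would also need to handle sensor noise and the discrete sampling locations shown in (F).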
