Review

Integrating Machine Learning with Human Knowledge

Changyu Deng et al. iScience. 2020 Oct 9;23(11):101656. doi: 10.1016/j.isci.2020.101656. eCollection 2020 Nov 20.
Abstract

Machine learning has been heavily researched and widely used in many disciplines. However, achieving high accuracy requires a large amount of data that is sometimes difficult, expensive, or impractical to obtain. Integrating human knowledge into machine learning can significantly reduce data requirements, increase the reliability and robustness of machine learning, and build explainable machine learning systems. Doing so leverages the vast body of human knowledge and the capability of machine learning to achieve functions and performance not available before, and it facilitates interaction between human beings and machine learning systems by making machine learning decisions understandable to humans. This paper gives an overview of the kinds of knowledge that can be integrated into machine learning, their representations, and the integration methodology. We cover the fundamentals, current status, and recent progress of the methods, with a focus on popular and new topics. Perspectives on future directions are also discussed.

Keywords: Artificial Intelligence; Computer Science; Human-Centered Computing.


Figures

Graphical abstract
Figure 1. Illustration of Hard and Soft Parameter Sharing. (A) Hard parameter sharing. (B) Soft parameter sharing. Redrawn from Ruder (2017a).
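As a concrete companion to Figure 1, the following is a minimal sketch of hard parameter sharing in PyTorch; the layer sizes, task count, and module names are illustrative assumptions, not the paper's implementation. Soft parameter sharing would instead keep a separate network per task and add a regularizer penalizing the distance between their parameters.

# Minimal sketch of hard parameter sharing for multi-task learning (illustrative sizes).
import torch
import torch.nn as nn

class HardSharingNet(nn.Module):
    """Shared trunk (hard sharing) with one output head per task."""
    def __init__(self, in_dim=32, hidden_dim=64, out_dims=(10, 1)):
        super().__init__()
        # Hard parameter sharing: these layers are reused by every task.
        self.shared = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # Task-specific output heads.
        self.heads = nn.ModuleList(nn.Linear(hidden_dim, d) for d in out_dims)

    def forward(self, x):
        h = self.shared(x)
        return [head(h) for head in self.heads]

model = HardSharingNet()
task_outputs = model(torch.randn(8, 32))  # one output tensor per task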
Figure 2. Illustration of Image Augmentation Techniques.
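As a companion to Figure 2, the snippet below sketches a typical augmentation pipeline using torchvision; the specific transforms and their parameters are illustrative choices, and "example.jpg" is a hypothetical input.

# Minimal sketch of common image augmentation techniques (illustrative parameters).
from PIL import Image
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),                    # mirror left-right
    transforms.RandomRotation(degrees=15),                     # small random rotation
    transforms.ColorJitter(brightness=0.2, contrast=0.2),      # photometric jitter
    transforms.RandomResizedCrop(size=224, scale=(0.8, 1.0)),  # crop and rescale
    transforms.ToTensor(),                                     # PIL image -> tensor
])

img = Image.open("example.jpg")  # hypothetical input image
augmented = augment(img)         # a different random variant on every call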
Figure 3. Pseudocode of an Active Learning Example. Rephrased from Settles (2012).
Figure 4. An Illustration of Active Learning: Choosing Which Data to Query for Better Estimation When Labeled Data Are Not Sufficient. Data shown are randomly generated from two Gaussian distributions with different means. Drawn based on the concept in Settles (2012). (A) Correct labels of the binary classification problem. The line denotes the decision boundary. (B) A model trained by random queries. (C) A model trained by active queries.
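The procedure behind Figures 3 and 4 can be sketched as pool-based active learning with uncertainty sampling. The snippet below is a minimal illustration in the spirit of Settles (2012), not the paper's exact pseudocode; the synthetic two-Gaussian data, the logistic regression learner, and the query budget are assumptions.

# Minimal sketch of pool-based active learning with uncertainty sampling.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Two Gaussian clusters with different means, as in Figure 4.
X = np.vstack([rng.normal(-1.0, 1.0, size=(200, 2)),
               rng.normal(+1.0, 1.0, size=(200, 2))])
y = np.array([0] * 200 + [1] * 200)

# Start with a few labeled points from each class; the rest form the unlabeled pool.
labeled = [0, 1, 200, 201]
pool = [i for i in range(len(X)) if i not in labeled]

model = LogisticRegression()
for _ in range(20):  # query budget (illustrative)
    model.fit(X[labeled], y[labeled])
    proba = model.predict_proba(X[pool])
    # Uncertainty sampling: query the pool instance the model is least sure about.
    uncertainty = 1.0 - proba.max(axis=1)
    query = pool.pop(int(np.argmax(uncertainty)))
    labeled.append(query)  # in practice, a human oracle supplies y[query]

model.fit(X[labeled], y[labeled])
print("Accuracy on all data:", model.score(X, y))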
Figure 5. Illustration of Traditional Machine Learning and Transfer Learning. (A) Tasks in traditional machine learning do not share knowledge. (B) Tasks in transfer learning share knowledge; the target task can reuse the knowledge of the source tasks. Drawn based on the concept in Pan and Yang (2009).
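As a companion to Figure 5, a common way to reuse source-task knowledge is to fine-tune a pretrained network on the target task. The sketch below assumes torchvision's ImageNet-pretrained ResNet-18; the choice of backbone, the frozen layers, and the five target classes are illustrative assumptions, not the paper's method.

# Minimal sketch of transfer learning: reuse a source model for a new target task.
import torch.nn as nn
from torchvision import models

# Source knowledge: a ResNet-18 pretrained on ImageNet (torchvision >= 0.13 API).
backbone = models.resnet18(weights="IMAGENET1K_V1")

# Freeze the pretrained parameters so only the new head is updated.
for param in backbone.parameters():
    param.requires_grad = False

# Replace the classification head for the target task (5 hypothetical classes).
backbone.fc = nn.Linear(backbone.fc.in_features, 5)

# Training then proceeds as usual; only backbone.fc receives gradient updates
# (more layers can be unfrozen when source and target tasks differ substantially).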
