Artificial intelligence in drug discovery: what is realistic, what are illusions? Part 2: a discussion of chemical and biological data
- PMID: 33508423
- PMCID: PMC8132984
- DOI: 10.1016/j.drudis.2020.11.037
Abstract
'Artificial intelligence' (AI) has recently had a profound impact on areas such as image and speech recognition, and this progress has already translated into practical applications. In the drug discovery field, however, such advances remain scarce, and one of the reasons is intrinsic to the data used. In this review, we discuss aspects of, and differences in, data from different domains, namely the image, speech, chemical, and biological domains; the amounts of data available; and their relevance to drug discovery. To truly advance the field of AI in drug discovery, future improvements are needed in our understanding of biological systems and in the subsequent generation of practically relevant data in sufficient quantities, so as to enable the discovery of novel chemistry with novel modes of action that shows desirable efficacy and safety in the clinic.
Copyright © 2021 The Authors. Published by Elsevier Ltd. All rights reserved.
