J Big Data. 2021;8(1):101. doi: 10.1186/s40537-021-00492-0. Epub 2021 Jul 19.

Text Data Augmentation for Deep Learning


Connor Shorten et al. J Big Data. 2021.

Abstract

Natural Language Processing (NLP) is one of the most captivating applications of Deep Learning. In this survey, we consider how the Data Augmentation training strategy can aid in its development. We begin with the major motifs of Data Augmentation summarized into strengthening local decision boundaries, brute force training, causality and counterfactual examples, and the distinction between meaning and form. We follow these motifs with a concrete list of augmentation frameworks that have been developed for text data. Deep Learning generally struggles with the measurement of generalization and characterization of overfitting. We highlight studies that cover how augmentations can construct test sets for generalization. NLP is at an early stage in applying Data Augmentation compared to Computer Vision. We highlight the key differences and promising ideas that have yet to be tested in NLP. For the sake of practical implementation, we describe tools that facilitate Data Augmentation such as the use of consistency regularization, controllers, and offline and online augmentation pipelines, to preview a few. Finally, we discuss interesting topics around Data Augmentation in NLP such as task-specific augmentations, the use of prior knowledge in self-supervised learning versus Data Augmentation, intersections with transfer and multi-task learning, and ideas for AI-GAs (AI-Generating Algorithms). We hope this paper inspires further research interest in Text Data Augmentation.

Keywords: Big Data; Data Augmentation; NLP; Natural Language Processing; Overfitting; Text Data.
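
As a minimal illustration of the offline and online augmentation pipelines previewed in the abstract, the Python sketch below expands a dataset once up front (offline) and, separately, transforms each batch on the fly (online). The augment function, a random word dropout, is a hypothetical stand-in for any of the text augmentations surveyed in the paper.

import random

def augment(text, p_drop=0.1):
    # Hypothetical augmentation for illustration: drop each word with probability p_drop.
    words = text.split()
    kept = [w for w in words if random.random() > p_drop]
    return " ".join(kept) if kept else text

def augment_offline(dataset, n_copies=2):
    # Offline pipeline: generate augmented copies once, before training starts.
    return dataset + [(augment(x), y) for x, y in dataset for _ in range(n_copies)]

def online_batches(dataset, batch_size=32):
    # Online pipeline: augment every example freshly, each time a batch is drawn.
    random.shuffle(dataset)
    for i in range(0, len(dataset), batch_size):
        yield [(augment(x), y) for x, y in dataset[i:i + batch_size]]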


Conflict of interest statement

Competing interests: The authors declare that they have no competing interests.

Figures

Fig. 1: Success of EDA applied to 5 text classification datasets. A key takeaway from these results is the performance difference with less data: the gain is much more pronounced with 500 labeled examples than with 5,000 or the full training set.
Fig. 2: Examples of easy data augmentation (EDA) transformations.
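
For readers who want the Fig. 2 transformations in code, here is a minimal sketch of two of EDA's four operations (random swap and random deletion); synonym replacement and random insertion additionally need a synonym source such as WordNet and are omitted. This is an illustrative sketch, not the implementation evaluated in the survey.

import random

def random_swap(words, n_swaps=1):
    # Swap the positions of two randomly chosen words, n_swaps times.
    words = list(words)
    for _ in range(n_swaps):
        if len(words) < 2:
            break
        i, j = random.sample(range(len(words)), 2)
        words[i], words[j] = words[j], words[i]
    return words

def random_deletion(words, p=0.1):
    # Delete each word with probability p, always keeping at least one word.
    kept = [w for w in words if random.random() > p]
    return kept if kept else [random.choice(list(words))]

sentence = "data augmentation can reduce overfitting on small labeled sets".split()
print(random_swap(sentence), random_deletion(sentence))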
Fig. 3: Left, word-level mixup; right, sentence-level mixup. The red outline highlights where augmentation occurs in the processing pipeline.
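
As a sketch of the word-level variant in Fig. 3, the snippet below interpolates the word embeddings and label vectors of two examples with a coefficient drawn from a Beta distribution; the shapes, the Beta parameter, and the use of one-hot labels are assumptions made for illustration. Sentence-level mixup would instead interpolate pooled sentence encodings.

import numpy as np

def word_level_mixup(emb_a, emb_b, label_a, label_b, alpha=0.2):
    # emb_*: (seq_len, embed_dim) embedded, padded sentences; label_*: one-hot vectors.
    lam = np.random.beta(alpha, alpha)
    return lam * emb_a + (1 - lam) * emb_b, lam * label_a + (1 - lam) * label_b

# Toy usage: two 5-word sentences with 8-dimensional embeddings and 3 classes.
a, b = np.random.randn(5, 8), np.random.randn(5, 8)
mixed_x, mixed_y = word_level_mixup(a, b, np.eye(3)[0], np.eye(3)[1])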
Fig. 4: Directions for feature space augmentation explored in MODALS.
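
The directions in Fig. 4 operate in the model's feature space rather than on raw text. As a rough sketch of two such moves (interpolation toward another example's representation and additive Gaussian noise), the snippet below perturbs a hidden feature vector; the scaling factors are assumptions, and MODALS itself learns how these moves are applied.

import numpy as np

def interpolate_features(h, h_other, lam=0.5):
    # Move a hidden representation part of the way toward another example's representation.
    return h + lam * (h_other - h)

def gaussian_noise_features(h, sigma=0.1):
    # Perturb a hidden representation with isotropic Gaussian noise.
    return h + sigma * np.random.randn(*h.shape)

h1, h2 = np.random.randn(128), np.random.randn(128)
augmented = gaussian_noise_features(interpolate_features(h1, h2, lam=0.3))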
Fig. 5: Fooled by injected text. Image taken from Jia and Liang [89].
Fig. 6: Unsupervised data augmentation schema. Image taken from Xie et al. [105].
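
The schema in Fig. 6 pairs a supervised loss on labeled data with a consistency term that keeps predictions on unlabeled examples and their augmented versions close. The PyTorch-style sketch below illustrates that combination; the model, the pre-augmented unlabeled batch, and the weighting factor lam are assumptions, and details of the original method such as confidence masking are omitted.

import torch
import torch.nn.functional as F

def uda_style_loss(model, x_lab, y_lab, x_unlab, x_unlab_aug, lam=1.0):
    # Supervised cross-entropy on the labeled batch.
    sup = F.cross_entropy(model(x_lab), y_lab)
    # Consistency: KL divergence from predictions on the original unlabeled inputs
    # (held fixed as targets) to predictions on their augmented versions.
    with torch.no_grad():
        target = F.softmax(model(x_unlab), dim=-1)
    log_pred_aug = F.log_softmax(model(x_unlab_aug), dim=-1)
    return sup + lam * F.kl_div(log_pred_aug, target, reduction="batchmean")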
Fig. 7: Developing attacks in TextAttack [119].
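
TextAttack (Fig. 7) composes attacks from a goal function, constraints, a transformation, and a search method, and ships prebuilt recipes. The sketch below runs one such recipe against a Hugging Face classifier; the class and model names follow my recollection of TextAttack's documented API around version 0.3 and should be checked against the version you install.

from textattack import Attacker, AttackArgs
from textattack.attack_recipes import TextFoolerJin2019
from textattack.datasets import HuggingFaceDataset
from textattack.models.wrappers import HuggingFaceModelWrapper
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Wrap a sentiment classifier so TextAttack can query it.
name = "textattack/bert-base-uncased-SST-2"
model = AutoModelForSequenceClassification.from_pretrained(name)
wrapper = HuggingFaceModelWrapper(model, AutoTokenizer.from_pretrained(name))

# Build a prebuilt attack recipe and run it on a handful of examples.
attack = TextFoolerJin2019.build(wrapper)
dataset = HuggingFaceDataset("glue", "sst2", split="validation")
Attacker(attack, dataset, AttackArgs(num_examples=10)).attack_dataset()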

References

    1. Shorten C, Khoshgoftaar T, Furht B. Deep learning applications for COVID-19. J Big Data. 2021. doi: 10.1186/s40537-020-00392-9.
    2. Tang R, Nogueira R, Zhang E, Gupta N, Cam P, Cho K, Lin J. Rapidly bootstrapping a question answering dataset for COVID-19. 2020. arXiv:2004.11339. Accessed Jul 2021.
    3. Cachola I, Lo K, Cohan A, Weld DS. TLDR: extreme summarization of scientific documents. 2020. arXiv:2004.15011. Accessed Jul 2021.
    4. Srivastava N, Hinton G, Krizhevsky A, Sutskever I, Salakhutdinov R. Dropout: a simple way to prevent neural networks from overfitting. J Mach Learn Res. 2014;15(1):1929–58.
    5. Kukačka J, Golkov V, Cremers D. Regularization for deep learning: a taxonomy. 2017. arXiv:1710.10686. Accessed Jul 2021.
