Neural Netw. 2021 Jan;133:103-111. doi: 10.1016/j.neunet.2020.10.011. Epub 2020 Oct 22.

Unsupervised feature learning for self-tuning neural networks

Jongbin Ryu et al. Neural Netw. 2021 Jan.

Abstract

In recent years, transfer learning has attracted much attention due to its ability to adapt a well-trained model from one domain to another. Fine-tuning is one of the most widely used methods; it exploits a small set of labeled data in the target domain to adapt the network. Including the few methods that use labeled data in the source domain, most transfer learning methods require labeled datasets, which restricts the application of transfer learning to new domains. In this paper, we propose a fully unsupervised self-tuning algorithm for learning visual features in different domains. The proposed method updates a pre-trained model by minimizing a triplet loss function using only unlabeled data in the target domain. First, we propose a relevance measure for unlabeled data computed by bagged clustering. Triplets of anchor, positive, and negative data points are then sampled wherever the ranking induced by the relevance scores and the ranking induced by the Euclidean distances in the embedded feature space disagree. This fully unsupervised self-tuning algorithm significantly improves the performance of the network. We extensively evaluate the proposed algorithm using various metrics, including classification accuracy, feature analysis, and clustering quality, on five benchmark datasets in different domains. Moreover, we demonstrate that applying the self-tuning method to a fine-tuned network helps achieve even better results.
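
As a rough illustration of the pipeline the abstract describes, the Python sketch below shows one plausible reading of the three steps, assuming a frozen feature extractor has already produced an (n, d) feature matrix: bagged k-means yields a co-occurrence relevance score, triplets are mined where the relevance ranking and the Euclidean-distance ranking disagree, and the standard triplet loss drives the update. All function names and parameter values here are hypothetical illustrations, not the authors' implementation.

    import numpy as np
    from sklearn.cluster import KMeans

    def bagged_relevance(features, n_bags=10, n_clusters=50, seed=0):
        """Relevance of sample pairs estimated by bagged k-means: two points
        are relevant in proportion to how often they share a cluster across
        bootstrap subsets. (Hypothetical sketch, not the paper's code.)"""
        rng = np.random.default_rng(seed)
        n = len(features)
        co = np.zeros((n, n))
        seen = np.zeros((n, n))
        for b in range(n_bags):
            sub = rng.choice(n, size=int(0.8 * n), replace=False)
            labels = KMeans(n_clusters=n_clusters, n_init=4,
                            random_state=seed + b).fit_predict(features[sub])
            for c in range(n_clusters):
                members = sub[labels == c]
                co[np.ix_(members, members)] += 1   # co-clustered this bag
            seen[np.ix_(sub, sub)] += 1             # co-sampled this bag
        return co / np.maximum(seen, 1)             # relevance scores in [0, 1]

    def sample_triplets(features, relevance):
        """Mine (anchor, positive, negative) triplets at ranking violations:
        the positive is more relevant to the anchor than the negative, yet
        lies farther away in the embedded feature space."""
        dist = np.linalg.norm(features[:, None] - features[None, :], axis=-1)
        triplets = []
        for a in range(len(features)):
            order = np.argsort(-relevance[a])       # most relevant first
            pos, neg = order[1], order[-1]          # order[0] is the anchor itself
            if relevance[a, pos] > relevance[a, neg] and dist[a, pos] > dist[a, neg]:
                triplets.append((a, pos, neg))      # violation: retune on this triplet
        return triplets

    def triplet_loss(f_a, f_p, f_n, margin=0.2):
        """Standard triplet loss, minimized to pull the anchor toward the
        positive and push it away from the negative."""
        return max(0.0, np.linalg.norm(f_a - f_p) - np.linalg.norm(f_a - f_n) + margin)

In an actual self-tuning loop, the mined triplets would be fed back through the network so the embedded features, and hence the distances, change each round; the sketch above only covers a single mining pass over fixed features.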

Keywords: Bagged clustering; Ranking violation for triplet sampling; Self-tuning neural network; Unsupervised feature learning; Unsupervised transfer learning.

Conflict of interest statement

Declaration of Competing Interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
