Neural Netw. 2024 Apr;172:106127. doi: 10.1016/j.neunet.2024.106127. Epub 2024 Jan 12.

PeNet: A feature excitation learning approach to advertisement click-through rate prediction


Yunfei Yin et al. Neural Netw. 2024 Apr.

Abstract

Because the physical meaning of dataset fields is often unknown, feature interaction methods must be used to select correlated features and exclude uncorrelated ones. Current state-of-the-art methods for advertisement Click-Through Rate (CTR) prediction rely on various forms of feature interaction; however, feature interaction based on mining potential new features, which can effectively assist interaction learning, is rarely considered. This motivates us to investigate methods that combine potential new features with feature interactions. We therefore propose the Potential feature Excitation learning Network (PeNet), a neural network model based on feature combination and feature interaction. In PeNet, we treat the row-wise and column-wise compressions of the original feature matrix as potential new features, and propose an excitation learning mechanism: a weighting scheme based on the residual principle, through which the original embedded features and the potential new features undergo weighted interaction. Moreover, a deep neural network is exploited to iteratively learn and combine features. The excitation learning structure of PeNet, a control flow of embedding, compression, excitation, and output, strengthens correlated features and weakens uncorrelated ones by compressing and expanding the features. Experimental results on multiple benchmark datasets indicate that PeNet, used as a general-purpose plug-in, achieves superior performance and better efficiency than previous state-of-the-art methods.
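The abstract describes compressing the embedded feature matrix along rows and columns to obtain "potential new features", then using an excitation mechanism with a residual connection to reweight the original features. The paper's exact formulation is not given here, so the following is a minimal NumPy sketch under assumptions: mean pooling is used for the row/column compression, the gates are produced by hypothetical learned projections `W_row` and `W_col` followed by a sigmoid, and the excited features are recombined additively in residual style.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def penet_excitation(E, W_row, W_col):
    """Hypothetical sketch of a PeNet-style excitation step.

    E:     (fields, dim) embedded feature matrix.
    W_row: (fields, fields) assumed learned projection for row features.
    W_col: (dim, dim) assumed learned projection for column features.
    """
    row_feat = E.mean(axis=1)              # (fields,) row compression
    col_feat = E.mean(axis=0)              # (dim,)    column compression
    row_gate = sigmoid(W_row @ row_feat)   # per-field excitation weights
    col_gate = sigmoid(W_col @ col_feat)   # per-dimension excitation weights
    excited = E * row_gate[:, None] * col_gate[None, :]
    return E + excited                     # residual-style recombination
```

The residual addition keeps the original embeddings intact while the multiplicative gates amplify correlated features and suppress uncorrelated ones, which is consistent with the compress-excite-output flow the abstract outlines.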

Keywords: CTR prediction; Embedding representation; Excitation learning; Feature interaction; Potential features.


Conflict of interest statement

Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
