Opt Express. 2024 Oct 21;32(22):39160-39176. doi: 10.1364/OE.531165.

All-optical multi-wavelength-channel ReLU activation function

Mohammad Mehdi Dehghani et al.

Abstract

Optical neural networks (ONNs) are custom optical circuits promising a breakthrough in low-power, parallelized, high-speed hardware for the growing demands of artificial intelligence applications. All-optical implementation of ONNs has proven burdensome, chiefly due to the lack of optical devices that can emulate the neurons' nonlinear activation function, thus forcing hybrid optical-electronic implementations. Moreover, ONNs suffer from a large footprint in comparison to their electronic (CMOS-based) counterparts. Utilizing virtual optical neurons in the time or frequency domain can reduce the number of required physical neurons, but an all-optical activation function is still required, especially for deep networks with several layers, each composed of multiple neurons. Here we propose an all-optical multi-wavelength-channel rectified linear unit (ReLU) activation function that leverages χ(2) nonlinearity across more than 100 wavelength channels simultaneously. Our design significantly reduces the footprint of ONNs by consolidating all the nonlinear activation functions in each layer of an ONN into a single broadband physical device. This enables the realization of all-optical, low-footprint ONNs with multiple layers, each made of several virtual neurons whose outputs are computed by a single ReLU activation function. We demonstrate this by simulating a 16-channel ReLU function in a realistic ONN and performing a multi-class classification task with a validation accuracy of 98.05%.
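The abstract's central architectural idea, one broadband nonlinear device serving every virtual neuron of a layer via wavelength multiplexing, can be illustrated with a minimal numerical sketch. The Python/NumPy snippet below is an idealized toy model, not the authors' simulation: the channel count, the random real-valued weight matrices standing in for the linear interferometric stage, and the lossless elementwise ReLU (named broadband_relu here) are all illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(seed=0)

    n_channels = 16   # wavelength channels = virtual neurons per layer (as in the 16-channel demo)
    n_layers = 3      # hypothetical network depth

    # Stand-ins for the linear (interferometric) part of each ONN layer.
    weights = [rng.normal(scale=1.0 / np.sqrt(n_channels),
                          size=(n_channels, n_channels))
               for _ in range(n_layers)]

    def broadband_relu(amplitudes):
        # One shared physical device: an idealized, lossless elementwise ReLU
        # applied to every wavelength channel at once.
        return np.maximum(amplitudes, 0.0)

    x = rng.normal(size=n_channels)   # input: one real amplitude per channel
    for W in weights:
        x = broadband_relu(W @ x)     # linear mixing, then the shared nonlinearity

    print(x.round(3))

In the paper's scheme the nonlinearity arises physically from χ(2) interactions; here it is replaced by a plain numerical max(x, 0) purely to show how a single activation stage can process all channels of a layer in parallel.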