Nat Commun. 2022 Feb 24;13(1):1044. doi: 10.1038/s41467-022-28702-0

Space-efficient optical computing with an integrated chip diffractive neural network

H H Zhu et al.
Abstract

Large-scale, highly integrated and low-power-consuming hardware is becoming progressively more important for realizing optical neural networks (ONNs) capable of advanced optical computing. Traditional experimental implementations need N² units such as Mach-Zehnder interferometers (MZIs) for an input dimension N to realize typical computing operations (convolutions and matrix multiplication), resulting in limited scalability and excessive power consumption. Here, we propose an integrated diffractive optical network that implements parallel Fourier transforms, convolution operations and application-specific optical computing using two ultracompact diffractive cells (for the Fourier transform operations) and only N MZIs. The footprint and energy consumption scale linearly with the input data dimension, instead of quadratically as in the traditional ONN framework. A ~10-fold reduction in both footprint and energy consumption, together with accuracy equal to that of previous MZI-based ONNs, was experimentally achieved for computations performed on the MNIST and Fashion-MNIST datasets. The integrated diffractive optical network (IDNN) chip demonstrates a promising avenue towards scalable, low-power optical computing chips for optical artificial intelligence.
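
The linear scaling rests on the convolution theorem: a length-N convolution becomes N element-wise complex modulations sandwiched between a Fourier transform and its inverse, so a fixed pair of diffractive Fourier cells plus N modulators can replace an N × N multiplier array. The NumPy sketch below illustrates this numerically; it is not the authors' implementation, and the dimension, random input field and kernel are illustrative assumptions.

    import numpy as np

    # Sketch of the Fourier-domain convolution principle behind the IDNN:
    # a length-N circular convolution is computed as IDFT( DFT(x) * H ),
    # needing only N element-wise complex modulations, whereas the direct
    # matrix-vector route needs an N x N multiplier array.

    N = 10                                                       # input dimension (illustrative)
    rng = np.random.default_rng(0)
    x = rng.standard_normal(N) + 1j * rng.standard_normal(N)     # coherent complex input field
    h = rng.standard_normal(N) + 1j * rng.standard_normal(N)     # convolution kernel

    # Fourier-domain route: DFT -> N element-wise modulations -> inverse DFT.
    H = np.fft.fft(h)                                            # fixed modulator settings in the Fourier plane
    y_fourier = np.fft.ifft(np.fft.fft(x) * H)

    # Reference route: equivalent N x N circulant matrix-vector product.
    C = np.array([np.roll(h, m) for m in range(N)]).T            # circulant matrix built from h
    y_matrix = C @ x

    print(np.allclose(y_fourier, y_matrix))                      # True: both give the circular convolution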


Conflict of interest statement

The authors declare no competing interests.

Figures

Fig. 1. Optical integrated diffractive neural networks (IDNNs).
a The multi-layer neural network. One layer contains three main parts: an optical discrete Fourier transform (ODFT) operation, amplitude/phase modulation, and an optical inverse discrete Fourier transform (OIDFT) operation; a nonlinear activation function is applied between adjacent layers (a minimal numerical sketch of this layer flow follows the figure captions). b The IDNN operates on complex-valued inputs using coherent light. Two transform matrices are implemented with diffractive cells, and a Hadamard product is realized by the phase and amplitude modulation placed after the ODFT operation. c Schematic of the experimental device. The device includes four functional parts: (1) input signal preparation; (2) implementing the ODFT operation; (3) modulating amplitude/phase in the Fourier domain; (4) implementing the OIDFT operation.
Fig. 2. Optical micrographs and simulation results.
a Optical micrograph of the whole chip. b Optical micrograph of the diffractive cell. c Simulated electric field distribution in the slab waveguide region. d Simulated output amplitude distribution along the output waveguide plane. e Simulated normalized intensity of the ten channels after the ODFT operation. f Simulated normalized retrieved intensity of the ten channels.
Fig. 3. Image recognition.
a Schematic of the image recognition process. The input test image dimension is n × n, the target image dimension is m × m, and the output dimension is k = n + m − 1. b Experimental normalized intensity from the output waveguides when the sample sequence [10101] is identical to the target sequence [10101]. c Experimental correlation results for digit image recognition. There are three digit images (1, 2, and 9), and each digit image is a 5 × 5 matrix.
Fig. 4. Iris flower classification using the diffractive neural network.
a The network consists of a single layer; the outputs of the layer, after intensity detection, are the recognition results. x1 is the sepal length, x2 the sepal width, x3 the petal length, and x4 the petal width; y1 is Setosa, y2 Versicolor, and y3 Virginica. b Experimental output intensity distribution of the IDNN for an input flower of the "Setosa" class. c Training and testing results of the classification; one classification error occurs in the test set. d Confusion matrix of the experiment, using 30 different flowers.
Fig. 5. Handwriting and fashion recognition using the IDNN.
a The network consists of a hidden layer W and an output layer Wout; the outputs of the output layer are the recognition results. The input layer is computed on a conventional electronic computer, and its complex-valued outputs are converted into amplitude and phase information that serves as the input to the chip. b Numerical testing accuracy and loss versus epoch number for the MNIST dataset. c Confusion matrix for the experimental results, using 500 different handwritten digits. d Output intensity distribution of the IDNN for a handwritten input of "2". e Numerical testing accuracy and loss versus epoch number for the Fashion-MNIST dataset. f Confusion matrix for the corresponding experiment. g Output intensity distribution of the IDNN for a fashion-product input of "pullover", shown as an example.
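
As referenced in the Fig. 1 caption, the following is a minimal numerical sketch of the single-layer flow (ODFT, amplitude/phase modulation in the Fourier domain, OIDFT) with a nonlinear activation between two stacked layers. The layer size, modulation values, and choice of activation are illustrative assumptions, not the authors' trained network.

    import numpy as np

    # One IDNN-style layer as described in Fig. 1a: Fourier transform,
    # element-wise complex modulation (Hadamard product, N modulators),
    # inverse Fourier transform.
    def idnn_layer(x, amplitude, phase):
        X = np.fft.fft(x) / np.sqrt(len(x))            # ODFT (unitary normalization)
        X_mod = X * amplitude * np.exp(1j * phase)     # amplitude/phase modulation in the Fourier domain
        return np.fft.ifft(X_mod) * np.sqrt(len(x))    # OIDFT

    N = 10
    rng = np.random.default_rng(1)
    x = rng.standard_normal(N) + 1j * rng.standard_normal(N)   # complex-valued input field

    # Two stacked layers; modulator settings are random placeholders here.
    amp1, phi1 = rng.uniform(0.0, 1.0, N), rng.uniform(-np.pi, np.pi, N)
    amp2, phi2 = rng.uniform(0.0, 1.0, N), rng.uniform(-np.pi, np.pi, N)

    hidden = np.abs(idnn_layer(x, amp1, phi1))         # nonlinear activation between layers (modulus, assumed)
    output = idnn_layer(hidden, amp2, phi2)

    intensity = np.abs(output) ** 2                    # intensity detection at the output waveguides
    print(np.round(intensity, 3))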

