Kazuhiro Gomi, NTT Research CEO. Leads research in physics & informatics, cryptography & information security, and medical & health informatics.

The idea of optical computing—the use of photons instead of electrons to perform computational operations—has been around for decades. However, interest has resurged in recent years; the potential for massive gains in energy efficiency makes it a good fit for the dawning age of AI and machine learning (ML).

Much discussion on this topic is scientific, but there are good reasons for business leaders to keep up with computer processing trends. Consider Nvidia, which began repurposing its graphics processing units (GPUs) from the gaming industry to AI in the mid-2000s. In 2024, it became one of the top five most valuable publicly traded companies in the U.S.

Is there another Nvidia out there? The optical computing market is young but promising. Implementations in the near term will solve big problems with niche requirements. The longer term looks even more exciting.

Photons, Electrons And Neural Networks

Electronic circuitry is so much a part of our landscape that it’s hard to imagine an alternative. However, we’re facing limits, from the size and clock speed of transistors on microchips to bottlenecks in the prevalent computer architecture, with its requirement for constant data transport between memory and processor. The fast-growing demand for massively parallel compute power driven by artificial neural networks exacerbates the situation, and the appetite for energy is skyrocketing.

These are all real constraints, but photonic computing could be a solution.

To explain how it works, physicists often point to the eyeglass, which transforms optical inputs such as scenery or words on a page into an output of corrected vision, turning a blurred image into a focused one. Moreover, this kind of “optical compute” occurs passively, without requiring energy.

A more technical account of optical computing’s potential came in an influential 2017 paper by MIT professor Dirk Englund and colleagues, showing how silicon photonics could be used to perform multiply-accumulate (MAC) operations, which are at the heart of artificial neural networks.
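To make that concrete, here is a minimal Python sketch (NumPy and the toy layer sizes are assumptions for illustration only, not anything taken from the paper) showing how a single neural-network layer decomposes into multiply-accumulate steps:

```python
import numpy as np

def dense_layer(weights, inputs):
    """One neural-network layer written out as explicit multiply-accumulate (MAC) steps."""
    outputs = np.zeros(weights.shape[0])
    for i in range(weights.shape[0]):        # one output neuron per row of the weight matrix
        acc = 0.0
        for j in range(inputs.shape[0]):     # each step is one multiply followed by one accumulate
            acc += weights[i, j] * inputs[j]
        outputs[i] = acc
    return outputs

# Illustrative example: a layer with 3 outputs and 4 inputs performs 3 x 4 = 12 MAC operations.
W = np.random.randn(3, 4)
x = np.random.randn(4)
print(dense_layer(W, x))                     # identical to W @ x, just spelled out MAC by MAC
```

The appeal of photonics is that, in principle, all of those steps can be carried out in parallel as light propagates through the device rather than looped over one at a time.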

Does this mean photons win against electrons? Not quite. With decades of product development, electronic computers are powerful engines that can deliver tremendous throughput. Surpassing them requires properly leveraging a mix of optical features and techniques, according to Cornell professor Peter McMahon. He lists bandwidth, spatial parallelism, one-way propagation, wave physics and other characteristics. (Full disclosure: NTT Research is one of three organizations listed in this paper as providing support to McMahon.)

McMahon also notes that the potential advantages of optical computing do not include one that is commonly invoked: the speed of light (a.k.a. “c”). First, c prevails only in a vacuum. Second, light travels at about 0.4c in silicon photonics, which is about as fast as electrons travel on printed circuit boards.

One challenge for photonics is its linearity. Photons are normally independent, non-interacting entities. (This is why lightsabers only work in the movies.) Silicon photonics is therefore good at linear, point-to-point data transport. Getting optics to behave in a more nonlinear fashion, which is a requisite for computing, has been a focus of recent research and development. Nonlinearity plays the role that transistors and diodes play in electronics, where they provide switching and amplification. Billions of transistors are now integrated onto a single chip to create logic circuits, the basis of today’s digital computers. We need something like that in optics.
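To see why nonlinearity is a requisite, consider the short sketch below (a purely illustrative NumPy example, not a model of any photonic device): stacking linear layers with nothing nonlinear in between collapses into a single matrix multiplication, so the stack computes nothing a single layer could not.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
x = rng.standard_normal(4)

# Two purely linear "layers" collapse into one: W2 @ (W1 @ x) equals (W2 @ W1) @ x.
linear_stack = W2 @ (W1 @ x)
single_layer = (W2 @ W1) @ x
print(np.allclose(linear_stack, single_layer))    # True: no added expressive power

# Insert a nonlinearity between the layers and the collapse no longer happens.
# np.tanh here is a stand-in for whatever nonlinear response a real device supplies.
nonlinear_stack = W2 @ np.tanh(W1 @ x)
print(np.allclose(nonlinear_stack, single_layer))  # Typically False: the network can now do more
```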

Components And Markets

One material that has helped photonics acquire nonlinearity is lithium niobate, a crystal (specifically, a synthetic salt) that has long been widely used to modulate light signals in optical communication systems. In the 2010s it was relaunched in thin-film form, which diverges from the bulk material in both fabrication and application.

The thin-film lithium niobate (TFLN) modulator market is already poised for very fast growth through 2029. Beyond communications and computation, TFLN can also be used for applications such as gas sensing.

Emerging optical computing systems will draw from various components, techniques and features. There’s no single blueprint. Two oft-mentioned startups are Lightmatter, a spinoff from Englund’s lab at MIT, and Luminous, which came out of Princeton University. There is also ongoing research at industrial labs, including NTT. (One of our labs at NTT Research is focused on a combination of photonics and quantum computing.)

Niche Applications

As with quantum computers, optical computers are far from “general purpose” status. If they can handle matrix-vector multiplication in a scalable fashion, however, that will open them up not only to neural networks but also to applications that involve combinatorial optimization problems, such as those found in materials science and logistics.
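As a rough illustration of that connection, many combinatorial optimization problems can be rewritten so that the costly step is a matrix-vector product, exactly the operation an optical processor would accelerate. The toy max-cut instance below, with an invented four-node coupling matrix, is a minimal Python sketch of the idea:

```python
import numpy as np
from itertools import product

# Toy max-cut instance: J[i, j] = 1 if nodes i and j share an edge (illustrative values only).
J = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]])

def ising_energy(spins):
    # Evaluating the energy is dominated by the matrix-vector product J @ spins,
    # the step an optical matrix-vector multiplier would perform.
    return spins @ (J @ spins)

# Brute-force search over +/-1 assignments; lower energy corresponds to a larger cut.
best = min((tuple(s) for s in product([-1, 1], repeat=4)),
           key=lambda s: ising_energy(np.array(s)))
print("best partition:", best)
```

In a real system the exhaustive search would of course be replaced by a smarter solver; the point is only that each candidate evaluation hinges on a matrix-vector multiplication.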

Other niche applications are emerging. In a 2024 article published in Light: Science & Applications, a Princeton University-led group of scientists demonstrated how photonic processing could efficiently suppress transmission errors and maintain signal-to-noise ratios in scenarios involving 5G cellular towers and aviation radar altimeters. Another niche application, the direct optical processing of visual scenes, is attractive for privacy purposes, given the greater difficulty of hacking images that never take electronic form.

In The Long Term

Two decades ago, GPUs were just beginning to take on work once reserved for CPUs. What does the future look like for optical computing? A general-purpose optical neural network may be years out, according to another McMahon paper, but could end up being 1,000 times as efficient as its electronic counterparts. To get there, expect more basic research.

The broader need, however, is urgent. Moore’s law is fading. The von Neumann computing model (which has separated memory from processing since the 1940s) has inherent limits. The projected growth in AI-driven data consumption is unsustainable.

The industry is already adjusting, for instance, by scaling down large language models (LLMs). However, to break through to another level and continue to address big computing challenges, it helps to think big and in new ways: salt (lithium niobate) instead of sand (silicon). Photons instead of electrons.

