Latitude Design Systems

Photonic Chips Curb AI Training’s Energy Appetite

Stanford team achieves first-ever optical backpropagation milestone.

Microscope images show the bidirectional grating tap [left] used to perform the AI protocol called backpropagation and a large-scale view [right] of this section of the chip. SCIENCE/AAAS

Processors that use light instead of electricity show promise as a faster and more energy-efficient way to implement AI. So far, they have only been used for inference, running models that were already trained, but new research has now demonstrated training AI on an optical chip for the first time.

As AI models grow ever larger, there is mounting concern about the amount of energy they consume, both because of ballooning costs and because of the potential impact on the environment. This is spurring interest in new approaches that can reduce AI's energy bills, with photonic processors emerging as a leading candidate.

These chips replace the electrons found in conventional processors with photons and use optical components like waveguides, filters, and light detectors to create circuits that can carry out computational tasks. They are particularly promising for running AI because they are very efficient at carrying out matrix multiplications—a key calculation at the heart of all deep-learning models. Companies like Boston-based Lightmatter and Lightelligence in Cambridge, Mass., are already working to commercialize photonic AI chips.
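To make the role of matrix multiplication concrete, the short NumPy sketch below shows that a dense layer's inference pass is a single matrix-vector product, while training via backpropagation also requires a second product through the transposed weights plus an outer-product weight update. This is purely illustrative and does not represent the Stanford team's optical implementation; the variable names (W, x, grad_y, learning_rate) are arbitrary.

```python
import numpy as np

# Illustration only: why matrix multiplication dominates deep learning,
# and why training (backpropagation) needs it in both directions.
rng = np.random.default_rng(0)

# One dense layer: y = W @ x (shapes chosen arbitrarily for the example)
W = rng.normal(size=(4, 8))   # layer weights
x = rng.normal(size=(8,))     # input activations

# Forward pass (inference): one matrix-vector multiplication
y = W @ x

# Backward pass (training): given the gradient of the loss w.r.t. y,
# backpropagation sends it through the *transposed* weights...
grad_y = rng.normal(size=(4,))
grad_x = W.T @ grad_y         # gradient passed back to the previous layer

# ...and the weight update itself is an outer product
grad_W = np.outer(grad_y, x)

learning_rate = 0.01
W -= learning_rate * grad_W   # simple gradient-descent step
```

The forward and backward passes are the same kind of operation run in opposite directions, which is why hardware that can multiply matrices bidirectionally, as the grating tap in the chip shown above does optically, is attractive for training and not just inference.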

Read More from IEEE Spectrum at:
