As demand for artificial intelligence grows, so does the thirst for the computing power needed to run it.
Lightmatter, an MIT-born startup, is betting that the voracious appetite for AI will spawn demand for a fundamentally different kind of computer chip, one that uses light to perform key calculations.
“Either we’re inventing new kinds of computers to keep going,” says Nick Harris, CEO of Lightmatter, “or AI slows down.”
Conventional computer chips operate by using transistors to control the flow of electrons through a semiconductor. By reducing information to a series of 1s and 0s, these chips can perform a wide range of logic operations and power complex software. Lightmatter’s chip, on the other hand, is designed to perform only a specific type of mathematical computation that is essential to running powerful AI programs.
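The article doesn’t name the operation, but the workhorse computation in modern AI is matrix multiplication, and that is the kind of linear algebra photonic accelerators like Lightmatter’s are generally described as speeding up. A minimal NumPy sketch of that workload, with hypothetical sizes:

```python
import numpy as np

# A single neural-network layer boils down to one matrix multiplication:
# every output is a weighted sum of every input. Sizes here are made up.
inputs = np.random.rand(1024)           # activations from the previous layer
weights = np.random.rand(4096, 1024)    # the layer's learned weights

outputs = weights @ inputs              # the multiply-accumulate workload
print(outputs.shape)                    # (4096,): 4096 dot products of length 1024
```

A chip that performs this one operation very quickly and cheaply covers most of the arithmetic a neural network ever does.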
Harris recently showed the new chip to WIRED at the company’s Boston headquarters. It looked like a regular computer chip with several optical fibers protruding from it. But it performed calculations by splitting and mixing beams of light inside tiny channels just nanometers wide. An underlying silicon chip orchestrates the photonic part and also provides temporary memory storage.
Lightmatter plans to start shipping its first light-based AI chip, called Envise, later this year. It will ship blade servers containing 14 of the chips, designed to fit into conventional data centers. The company has raised $22 million from GV (formerly Google Ventures), Spark Capital, and Matrix Partners.
The company claims its chip runs 1.5 to 10 times faster than a top-of-the-line Nvidia A100 AI chip, depending on the task. Running a natural-language model called BERT, for example, Lightmatter says Envise is five times faster than the Nvidia chip while consuming one-sixth as much energy.
The technology has technical limitations, and it may be difficult to persuade companies to switch to an unproven design. But Rich Wawrzyniak, an analyst at Semico who has been briefed on the technology, believes it has a good chance of gaining ground. “What they showed me, I think it’s pretty good,” he says.
Wawrzyniak expects big tech companies to at least test the technology, given how rapidly both demand for AI and the cost of running it are rising. “This is a pressing issue in many ways,” he says. The energy needs of data centers are “climbing like a rocket.”
Lightmatter’s chip is faster and more efficient for some AI calculations because information can be encoded more efficiently into different wavelengths of light, and because controlling light requires less energy than controlling the flow of electrons with transistors.
A key limitation of the Lightmatter chip is that its calculations are analog rather than digital. That makes it inherently less precise than digital silicon chips, though the company has developed techniques to improve the accuracy of its calculations. Lightmatter will initially market its chips for running pre-trained AI models rather than for training them, because inference requires less precision, but Harris says they can in principle do both.
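To make the precision point concrete, here is a minimal sketch, not Lightmatter’s actual error model: it mimics analog imprecision by redoing a toy layer’s arithmetic in low-precision floating point and checking whether the network’s final answer changes.

```python
import numpy as np

# Hypothetical toy layer with 10 outputs (e.g., 10 class scores).
rng = np.random.default_rng(0)
weights = rng.standard_normal((10, 256))
inputs = rng.standard_normal(256)

# Full-precision result vs. a low-precision stand-in for analog noise.
exact = weights @ inputs
approx = (weights.astype(np.float16) @ inputs.astype(np.float16)).astype(np.float64)

print(np.max(np.abs(exact - approx)))         # small per-element error
print(np.argmax(exact) == np.argmax(approx))  # the final "decision" usually survives
```

Inference tolerates this kind of error because only the relative ordering of outputs matters; training is touchier, since tiny gradient errors compound over millions of update steps.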
The chip will be most useful for the type of AI known as deep learning, which relies on training very large, or “deep,” neural networks to make sense of data and make useful decisions. The approach has given computers new capabilities in image and video processing, natural-language understanding, robotics, and making sense of business data. But it requires large amounts of data and computing power.
Training and running deep neural networks means performing enormous numbers of calculations in parallel, a task well suited to high-end graphics chips. The rise of deep learning has already inspired a flowering of new chip designs, from parts specialized for data centers to highly efficient designs for mobile gadgets and portable devices, as the sketch below illustrates.
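A rough illustration of why the workload parallelizes so well (toy sizes, hypothetical helper function): each output of a neural-network layer depends only on the inputs, not on any other output, so the work splits cleanly across many execution units, the property that GPUs, and photonic chips, exploit.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(1)
weights = rng.standard_normal((512, 256))
inputs = rng.standard_normal(256)

def one_output(row):
    # Each output row is an independent dot product: no ordering required.
    return weights[row] @ inputs

# Compute all 512 outputs across a pool of workers, in whatever order.
with ThreadPoolExecutor() as pool:
    parallel = np.array(list(pool.map(one_output, range(512))))

print(np.allclose(parallel, weights @ inputs))  # True: same answer either way
```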