In 2011, Marc Andreessen, general partner of the venture capital firm Andreessen Horowitz, wrote an influential article in The Wall Street Journal titled "Why Software Is Eating the World." A decade later, it's deep learning that's eating the world.

Deep learning, which is to say artificial neural networks with many hidden layers, is regularly stunning us with solutions to real-world problems. And it is doing that in more and more realms, including natural-language processing, fraud detection, image recognition, and autonomous driving. Indeed, these neural networks are getting better by the day.

But these advances come at an enormous price in the computing resources and energy they consume. So it's no wonder that engineers and computer scientists are making huge efforts to figure out ways to train and run deep neural networks more efficiently.

An ambitious new strategy that's coming to the fore this year is to perform many of the required mathematical calculations using photons rather than electrons. In particular, one company, Lightmatter, will begin marketing late this year a neural-network accelerator chip that calculates with light. It will be a refinement of the prototype Mars chip that the company showed off at the virtual Hot Chips conference last August.

While the development of a commercial optical accelerator for deep learning is a remarkable accomplishment, the general idea of computing with light is not new. Engineers regularly resorted to this tactic in the 1960s and '70s, when electronic digital computers were too feeble to perform the complex calculations needed to process synthetic-aperture radar data. So they processed the data in the analog domain, using light.

[Photo: Lightmatter. Plug and Play: Lightmatter's prototype board uses a normal PCI bus.]

Because of the subsequent Moore's Law gains in what could be done with digital electronics, optical computing never really caught on, despite the ascendancy of light as a vehicle for data communications. But all that may be about to change: Moore's Law may be nearing an end, just as the computing demands of deep learning are exploding.

There aren't many ways to deal with this problem. Deep-learning researchers may develop more efficient algorithms, sure, but it's hard to imagine those gains will be sufficient. "I challenge you to lock a bunch of theorists in a room and have them come up with a better algorithm every 18 months," says Nicholas Harris, CEO of Lightmatter. That's why he and his colleagues are bent on "developing a new compute technology that doesn't rely on the transistor."

The fundamental component in Lightmatter's chip is a Mach-Zehnder interferometer. This optical device was jointly invented by Ludwig Mach and Ludwig Zehnder in the 1890s. But only recently have such optical devices been miniaturized to the point where large numbers of them can be integrated onto a chip and used to perform the matrix multiplications involved in neural-network calculations.

Keren Bergman, a professor of electrical engineering and the director of the Lightwave Research Laboratory at Columbia University, in New York City, explains that these feats have become possible only in the last few years because of the maturing of the manufacturing ecosystem for integrated photonics, needed to make photonic chips for communications.
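To see why a Mach-Zehnder interferometer is the natural building block for a photonic matrix multiplier, it helps to write down its transfer matrix. The sketch below is an illustrative numerical model, not Lightmatter's actual design: it assumes one common textbook convention (two 50:50 beam splitters around a tunable internal phase shift, plus an external phase shift on one arm) and ignores optical loss. Setting the two phases steers a 2x2 unitary matrix, and applying that matrix to a pair of optical amplitudes is exactly a small matrix-vector product.

```python
import numpy as np

# 50:50 beam-splitter transfer matrix (one common convention).
BS = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                  [1j, 1]])

def mzi(theta, phi):
    """Transfer matrix of an idealized Mach-Zehnder interferometer:
    beam splitter -> internal phase shift theta on one arm ->
    beam splitter -> external phase shift phi on one arm.
    Illustrative, lossless model; real devices vary in convention."""
    internal = np.diag([np.exp(1j * theta), 1.0])
    external = np.diag([np.exp(1j * phi), 1.0])
    return external @ BS @ internal @ BS

# Any phase setting yields a unitary matrix: light is redistributed
# between the two output ports, never created or destroyed.
U = mzi(0.7, 1.3)
assert np.allclose(U.conj().T @ U, np.eye(2))

# Encode a 2-element input vector as optical amplitudes; the MZI's
# output is then a 2x2 matrix-vector product, computed "for free"
# as the light propagates.
x = np.array([1.0, 0.5])
y = U @ x
print(np.abs(y) ** 2)  # powers a photodetector would measure
```

Larger matrices are handled by wiring many such 2x2 elements into a triangular or rectangular mesh, so that their product composes an arbitrary N-by-N unitary; this mesh-decomposition approach is standard in programmable photonics, though the details of any particular commercial chip are not described here.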