Apple’s AI Director: Here’s How to Supercharge Deep Learning

Ruslan Salakhutdinov, who leads Apple’s AI efforts, says emerging techniques could make the most popular approach in the field far more powerful.

by Will Knight  March 29, 2017

Apple’s director of artificial intelligence, Ruslan Salakhutdinov, believes that the deep neural networks behind the spectacular results of recent years could be supercharged in the years ahead by the addition of memory, attention, and general knowledge.


Speaking at MIT Technology Review’s EmTech Digital conference in San Francisco on Tuesday, Salakhutdinov said these attributes could help solve some of the outstanding problems in artificial intelligence.


Salakhutdinov, who retains a post as an associate professor at Carnegie Mellon University in Pittsburgh, pointed in his talk to limitations of deep-learning-driven machine vision and natural-language understanding.


Deep learning—a technique that involves using vast numbers of roughly simulated neurons arranged in many interconnected layers—has produced dramatic progress in machine perception over recent years, but there are many ways in which these networks are limited.
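The “simulated neurons arranged in interconnected layers” the article describes can be sketched in a few lines. This is a toy forward pass only, with hand-picked weights (the names `relu`, `layer`, and `forward` and all numbers are illustrative, not from the talk); a real network would have millions of learned parameters.

```python
def relu(x):
    # Simple nonlinearity applied between layers.
    return [max(0.0, v) for v in x]

def layer(inputs, weights, biases):
    # Each output neuron is a weighted sum of all inputs plus a bias:
    # the "interconnected layers" of roughly simulated neurons.
    return [sum(w * x for w, x in zip(ws, inputs)) + b
            for ws, b in zip(weights, biases)]

def forward(x, params):
    # Stack layers, applying the nonlinearity between them.
    for weights, biases in params[:-1]:
        x = relu(layer(x, weights, biases))
    weights, biases = params[-1]
    return layer(x, weights, biases)

# Hypothetical weights for a 2-input, 2-hidden-unit, 1-output network.
params = [
    ([[1.0, -1.0], [0.5, 0.5]], [0.0, 0.1]),
    ([[1.0, 2.0]], [-0.5]),
]
print(forward([1.0, 2.0], params))  # a single output value
```

Training would adjust `params` by gradient descent; here they are fixed purely to show the layered structure.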


Salakhutdinov showed, for example, how image captioning systems based on the technology can label images incorrectly because they tend to focus on everything in the image at once. He then pointed to a solution in the form of so-called “attention mechanisms,” a tweak to deep learning developed in the last few years. The approach can remedy these errors by having a system focus on specific parts of an image when generating each word of a caption. The same approach can help improve natural-language understanding, too, by enabling a machine to focus on the relevant part of a sentence in order to infer its meaning.
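The core of an attention mechanism is small: score each part of the input against a query, normalize the scores with a softmax, and take the weighted average. A minimal sketch, assuming image regions and the query are already encoded as plain feature vectors (in a real captioner these come from learned network layers):

```python
import math

def softmax(scores):
    # Numerically stable softmax: turns raw scores into weights summing to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(query, regions):
    # Score each region by dot product with the query, then return the
    # softmax-weighted average of region features plus the weights
    # (the weights show which region the model "looks at" for this word).
    scores = [sum(q * r for q, r in zip(query, feat)) for feat in regions]
    weights = softmax(scores)
    dim = len(regions[0])
    context = [sum(w * feat[i] for w, feat in zip(weights, regions))
               for i in range(dim)]
    return context, weights
```

With a query strongly aligned to one region, nearly all of the weight lands on that region, so the caption word is driven by one part of the image rather than by everything at once.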


A technique called memory networks, developed by researchers at Facebook, can improve how machines talk with people. As the name suggests, the approach adds a component of long-term memory to neural networks so that they remember the history of a chat.
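The idea can be sketched with a toy chat memory: store each past utterance as a vector, and answer a query by retrieving the best-matching stored line. This is a loose illustration of the read/write structure, not Facebook's actual model; the bag-of-words `embed` and the fixed `vocab` are assumptions standing in for learned embeddings.

```python
def embed(sentence, vocab):
    # Toy bag-of-words embedding over a fixed vocabulary.
    words = sentence.lower().split()
    return [words.count(w) for w in vocab]

class ChatMemory:
    """Minimal long-term memory: write each past utterance, and read
    by returning the stored line that best matches a query."""
    def __init__(self, vocab):
        self.vocab = vocab
        self.slots = []  # list of (embedding, utterance) pairs

    def write(self, utterance):
        self.slots.append((embed(utterance, self.vocab), utterance))

    def read(self, query):
        # Dot-product match between the query and each memory slot.
        q = embed(query, self.vocab)
        scores = [sum(a * b for a, b in zip(q, m)) for m, _ in self.slots]
        best = max(range(len(scores)), key=scores.__getitem__)
        return self.slots[best][1]

vocab = ["milk", "ball", "kitchen", "garden"]
mem = ChatMemory(vocab)
mem.write("the milk is in the kitchen")
mem.write("the ball is in the garden")
print(mem.read("where is the milk"))  # retrieves the kitchen line
```

The real memory networks use soft (softmax-weighted) reads over learned embeddings so the whole system is trainable end to end; the hard `max` here is a simplification.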


Memory networks have been shown to improve another kind of AI as well, known as reinforcement learning. For example, two researchers at CMU recently showed how this could create a smarter game-playing algorithm. Researchers at DeepMind, an AI-focused subsidiary of Alphabet, have also demonstrated ways for deep-learning systems to build and access a form of memory.


Reinforcement learning is rapidly emerging as a valuable way to solve hard-to-program problems in robotics and automated driving. It was one of MIT Technology Review’s 10 Breakthrough Technologies of 2017.
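Reinforcement learning in its simplest form can be shown with tabular Q-learning, where an agent learns action values from reward alone rather than from programmed rules. The corridor environment and all hyperparameters below are invented for illustration:

```python
import random

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    # Tabular Q-learning on a 5-cell corridor: start at cell 0,
    # reward 1.0 for reaching cell 4. Actions: 0 = left, 1 = right.
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(5)]  # q[state][action]
    for _ in range(episodes):
        s = 0
        while s != 4:
            if rng.random() < eps:
                a = rng.randrange(2)                 # explore
            else:
                a = 1 if q[s][1] >= q[s][0] else 0   # exploit
            s2 = max(0, s - 1) if a == 0 else min(4, s + 1)
            r = 1.0 if s2 == 4 else 0.0
            # Standard Q-learning update toward reward plus
            # discounted value of the best next action.
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q
```

After training, the learned values favor moving right in every cell: the agent discovered the policy from trial and error, which is what makes the approach attractive for hard-to-program tasks like robotic control.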


Another exciting area of future research, Salakhutdinov said, would be finding ways to combine hand-built sources of knowledge with deep learning. He pointed to general-knowledge databases like Freebase and word-meaning repositories like WordNet.
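One way such a hand-built resource helps is by supplying relationships a statistical model would otherwise have to learn from data. A minimal sketch with a toy WordNet-style hypernym table (the entries are illustrative, not taken from the real WordNet database):

```python
# Toy hand-built hypernym ("is-a") table in the spirit of WordNet.
HYPERNYMS = {
    "poodle": "dog",
    "beagle": "dog",
    "dog": "animal",
    "cat": "animal",
}

def generalize(word):
    # Walk up the hypernym chain, attaching prior knowledge:
    # a system that has never seen "poodle" in training data can
    # still reason about it as a dog, and hence an animal.
    chain = [word]
    while chain[-1] in HYPERNYMS:
        chain.append(HYPERNYMS[chain[-1]])
    return chain

print(generalize("poodle"))  # ['poodle', 'dog', 'animal']
```

How to fold such symbolic chains into the continuous representations of a deep network is exactly the open challenge Salakhutdinov describes.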


Just as humans rely heavily on general knowledge when parsing language or interpreting a visual scene, this could help make AI systems smarter, Salakhutdinov said. “How can we incorporate all that prior knowledge into deep learning?” he said during his talk. “That’s a big challenge.”


Salakhutdinov spoke during a session that brought together researchers from several different schools of AI. A common theme among the speakers was the need for different approaches in order to take AI to the next level.


During the session Pedro Domingos, a professor at the University of Washington who studies different machine-learning approaches, said there is also a need to keep searching for completely new approaches to AI. “There’s a school of thought in machine learning that we don’t need fancy new algorithms, we just need more data,” he said. “I think there are really deep, fundamental ideas that need to be discovered before we can really solve AI.”
