DEEP LEARNING WITH DYNAMIC COMPUTATION GRAPHS

Neural networks that compute over graph structures are a natural fit for problems
in a variety of domains, including natural language (parse trees) and cheminformatics (molecular graphs).

However, since the computation graph has a different shape and size for every input, such networks do not directly support batched training or inference. They are also difficult to implement in popular deep learning libraries, which are based on static data-flow graphs.

We introduce a technique called dynamic batching, which not only batches together operations between different input graphs of dissimilar shape, but also between different nodes within a single input graph. The technique allows us to create static graphs, using popular libraries, that emulate dynamic computation graphs of arbitrary shape and size.
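The core idea can be illustrated in plain NumPy: group the nodes of many input trees by depth, so that every node at a given depth (whether it comes from the same tree or a different one) is evaluated in a single batched matrix operation. This is a minimal toy sketch of that scheduling idea, not the paper's implementation; all names (`eval_batched`, `embed`, the toy weight matrix `W`) are illustrative.

```python
# Toy sketch of dynamic batching: nodes from many binary trees are
# grouped by depth, so each depth level costs one batched matmul
# instead of one op per node. Trees are nested tuples; leaves are floats.
import numpy as np

rng = np.random.default_rng(0)
DIM = 4
W = rng.standard_normal((2 * DIM, DIM))  # shared combine weights (toy)

def embed(leaf):
    # toy leaf embedding: a constant vector scaled by the leaf value
    return np.full(DIM, float(leaf))

def depth(node):
    if isinstance(node, tuple):
        return 1 + max(depth(node[0]), depth(node[1]))
    return 0

def eval_sequential(node):
    # reference evaluation: one op per node, no batching
    if isinstance(node, tuple):
        left, right = eval_sequential(node[0]), eval_sequential(node[1])
        return np.tanh(np.concatenate([left, right]) @ W)
    return embed(node)

def eval_batched(trees):
    """Evaluate many trees with one matrix multiply per depth level."""
    levels = {}  # depth -> list of nodes at that depth, across all trees
    def collect(node):
        levels.setdefault(depth(node), []).append(node)
        if isinstance(node, tuple):
            collect(node[0]); collect(node[1])
    for t in trees:
        collect(t)
    memo = {}  # id(node) -> computed vector
    for d in sorted(levels):  # children always lie at a smaller depth
        nodes = levels[d]
        if d == 0:
            for n in nodes:
                memo[id(n)] = embed(n)
        else:
            # one batched op for every internal node at this depth,
            # regardless of which input tree it belongs to
            X = np.stack([np.concatenate([memo[id(n[0])], memo[id(n[1])]])
                          for n in nodes])
            H = np.tanh(X @ W)
            for n, h in zip(nodes, H):
                memo[id(n)] = h
    return [memo[id(t)] for t in trees]
```

Because the combine weights are shared across all nodes, the batched schedule produces exactly the same per-tree outputs as node-by-node evaluation, while the number of launched operations grows with the maximum tree depth rather than the total node count.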

We further present a high-level library of compositional blocks that simplifies the creation of dynamic graph models. Using the library, we demonstrate concise and batch-wise parallel implementations for a variety of models from the literature.
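To give a flavor of what "compositional blocks" means, here is a hypothetical mini combinator library in the same spirit (the paper's library is TensorFlow Fold; the `Block`, `Scalar`, `Map`, and `Sum` names below are a simplified illustration, not Fold's real API). Small blocks transform inputs, and `>>` pipes one block's output into the next, so a whole model over structured input is assembled from reusable pieces.

```python
# Hypothetical sketch of a compositional-blocks style, NOT the real
# TensorFlow Fold API: a Block wraps a function, and b1 >> b2 composes
# them into a pipeline.
class Block:
    def __init__(self, fn):
        self.fn = fn

    def __call__(self, x):
        return self.fn(x)

    def __rshift__(self, other):
        # b1 >> b2 feeds b1's output into b2
        return Block(lambda x: other(self.fn(x)))

Scalar = Block(float)            # parse a raw input into a scalar
Sum = Block(sum)                 # reduce a list of values

def Map(block):
    # lift a block over a list of inputs, element-wise
    return Block(lambda xs: [block(x) for x in xs])

# sum of squares over a list of string-encoded numbers
pipeline = Map(Scalar >> Block(lambda v: v * v)) >> Sum
# pipeline(["1", "2", "3"]) -> 14.0
```

In the real library, such pipelines describe computations over parse trees or molecular graphs, and the dynamic-batching scheduler turns them into batched static-graph operations automatically.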
