AI Resources Digest: Issue 27 (20170208)
Published: 2019-05-10



  1.【Blog】DeepMind's PathNet: A Modular Deep Learning Architecture for AGI

Summary:

PathNet is a new modular Deep Learning (DL) architecture, brought to you by who else but DeepMind, that highlights the latest trend in DL research: melding several learning techniques, including Reinforcement Learning, into a solution that leads to more capable DL systems. The arXiv paper "PathNet: Evolution Channels Gradient Descent in Super Neural Networks" (Fernando et al.), submitted on January 20, 2017, gives an interesting description of the work in its abstract.

Original link:


2.【Blog】On the intuition behind deep learning & GANs—towards a fundamental understanding

Summary:

A generative adversarial network (GAN) is composed of two separate networks: the generator and the discriminator. It poses the unsupervised learning problem as a game between the two. In this post we will see why GANs have so much potential, and frame GANs as a boxing match between two opponents.
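The game framing can be made concrete with a small sketch (my own illustration, not code from the post): a classic result for GANs is that, with the generator held fixed, the discriminator's best response at each point x is p_data(x) / (p_data(x) + p_g(x)), which collapses to 0.5 everywhere once the generator matches the data.

```python
# Toy illustration of the GAN "game" over a discrete sample space:
# for a fixed generator, the optimal discriminator is
# D*(x) = p_data(x) / (p_data(x) + p_g(x)).

def optimal_discriminator(p_data, p_g):
    """Best-response discriminator for fixed data and generator distributions."""
    return {x: p_data[x] / (p_data[x] + p_g[x]) for x in p_data}

# Two toy distributions over four outcomes.
p_data = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}
p_g    = {"a": 0.1, "b": 0.2, "c": 0.3, "d": 0.4}

d_star = optimal_discriminator(p_data, p_g)
print(d_star["a"])  # 0.8: confidently "real" where the generator is weak

# If the generator perfectly matches the data, D* is 0.5 everywhere:
# the discriminator can no longer tell real from fake, and the match is a draw.
d_tie = optimal_discriminator(p_data, p_data)
print(d_tie["a"])  # 0.5
```

This is the sense in which the two networks "fight": training drives the generator toward the point where the discriminator's best response is pure guessing.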

Original link:


3.【Code】nmtpy

Summary:

**nmtpy** is a suite of Python tools, primarily based on the starter code provided in **dl4mt-tutorial**, for training neural machine translation networks using Theano.

The basic motivation behind forking **dl4mt-tutorial** was to create a framework where it would be easy to implement a new model by just copying and modifying an existing model class (or even inheriting from it and overriding some of its methods).
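The "inherit and override" workflow the authors describe can be sketched as follows; note that the class and method names here are illustrative stand-ins, not the actual nmtpy API.

```python
# Hypothetical sketch of defining a new model by inheriting an existing
# model class and overriding one method (names are illustrative only).

class BaseNMTModel:
    """Minimal stand-in for a framework-provided model class."""

    def __init__(self, vocab_size, hidden_size):
        self.vocab_size = vocab_size
        self.hidden_size = hidden_size

    def build_encoder(self):
        return f"gru-encoder(h={self.hidden_size})"

    def build(self):
        # The framework assembles the model; subclasses customize the pieces.
        return (self.build_encoder(), f"decoder(v={self.vocab_size})")


class BiGRUModel(BaseNMTModel):
    """A 'new model' created by overriding just the encoder."""

    def build_encoder(self):
        return f"bidirectional-gru-encoder(h={self.hidden_size})"


model = BiGRUModel(vocab_size=30000, hidden_size=512)
print(model.build())
```

The appeal of this design is that a new architecture only has to state what it changes; everything else (training loop, data handling) is inherited unchanged.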

Original link:


4.【Blog】Demystifying Word2Vec

Summary:

Research into word embeddings is one of the most interesting areas of the deep learning world at the moment, even though embeddings were introduced as early as 2003 by Bengio et al. Most prominent among these newer techniques is a group of related algorithms commonly referred to as Word2Vec, which came out of Google research.[^2]

In this post we are going to investigate the significance of Word2Vec for NLP research going forward, and how it relates and compares to prior techniques in the field. In particular, we will examine some desired properties of word embeddings and the shortcomings of other popular approaches centered around the Bag of Words concept (henceforth simply BoW), such as Latent Semantic Analysis. This motivates a detailed exposition of how and why Word2Vec works, and of whether the word embeddings derived from this method can remedy some of the shortcomings of BoW-based approaches. Word2Vec and the concept of word embeddings originate in NLP; however, as we shall see, the idea of words in the context of a sentence or a surrounding word window can be generalized to any problem domain dealing with sequences or sets of related data points.
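One BoW shortcoming the post alludes to is easy to demonstrate with a minimal pure-Python sketch (my own illustration, not code from the post): two sentences with closely related meanings but no words in common are maximally dissimilar under a count-based BoW representation.

```python
# Minimal bag-of-words vectorizer illustrating a BoW shortcoming:
# related sentences that share no words look completely unrelated.
import math

def bow_vector(sentence, vocab):
    """Count-based BoW vector over a fixed vocabulary."""
    words = sentence.lower().split()
    return [words.count(w) for w in vocab]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

s1 = "the cat sat on the mat"
s2 = "a feline rested upon a rug"
vocab = sorted(set(s1.split()) | set(s2.split()))

sim = cosine(bow_vector(s1, vocab), bow_vector(s2, vocab))
print(sim)  # 0.0: no shared words, so BoW sees no similarity at all
```

Word embeddings address exactly this: by placing "cat" and "feline" near each other in a dense vector space, they let similarity survive a change of vocabulary.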

Original link:


5.【Blog】Highlights and tutorials for concepts discussed in "Richard Socher on the future of deep learning"

Summary:

Bruner, Jon. "The O'Reilly Bots Podcast," audio blog post: "Richard Socher on the Future of Deep Learning." O'Reilly, December 1, 2016.

I highly encourage listening to the podcast, because the questions were so well crafted.

TL;DR: Richard Socher of Salesforce (formerly Stanford and MetaMind) offers insight into the current and future state of deep learning for NLP. We need one model that can do lots of different tasks, and we need to be wary of bias in our models. The future of conversational bots is multimodal, and Salesforce research is awesome.

Disclaimer: This is my interpretation of the interview. I have included the pertinent questions I found interesting.

Original link:

