Ranked #1
36 - Attention Is All You Need, with Ashish Vaswani and Jakob Uszkoreit
NIPS 2017 paper. We dig into the details of the Transformer, from the "attention is all you need" paper. Ashish and Jako...
23 Oct 2017 • 41mins
Ranked #2
116 - Grounded Language Understanding, with Yonatan Bisk
We invited Yonatan Bisk to talk about grounded language understanding. We started off by discussing an overview of the t...
3 Jul 2020 • 59mins
Ranked #3
114 - Behavioral Testing of NLP Models, with Marco Tulio Ribeiro
We invited Marco Tulio Ribeiro, a Senior Researcher at Microsoft, to talk about evaluating NLP models using behavioral t...
26 May 2020 • 43mins
Ranked #4
56 - Deep contextualized word representations, with Matthew Peters
NAACL 2018 paper, by Matt Peters, Mark Neumann, Mohit Iyyer, Matt Gardner, Chris Clark, Kenton Lee, and Luke Zettlemoyer...
4 Apr 2018 • 30mins
Ranked #5
113 - Managing Industry Research Teams, with Fernando Pereira
We invited Fernando Pereira, a VP and Distinguished Engineer at Google, where he leads NLU and ML research, to talk abou...
22 May 2020 • 42mins
Ranked #6
28 - Data Programming: Creating Large Training Sets, Quickly
NIPS 2016 paper by Alexander Ratner and coauthors in Chris Ré's group at Stanford, presented by Waleed. The paper present...
11 Jul 2017 • 25mins
Ranked #7
15 - Attention and Augmented Recurrent Neural Networks
http://distill.pub/2016/augmented-rnns/
6 Jun 2017 • 14mins
Ranked #8
23 - Get To The Point: Summarization with Pointer-Generator Networks
ACL 2017 paper by Abigail See, Peter Liu, and Chris Manning. Matt presents the paper, describing the task (summarization ...
26 Jun 2017 • 17mins
Ranked #9
02 - Bidirectional Attention Flow for Machine Comprehension
https://www.semanticscholar.org/paper/Bidirectional-Attention-Flow-for-Machine-Seo-Kembhavi/007ab5528b3bd310a80d553cccad...
12 May 2017 • 14mins
Ranked #10
03 - FastQA: A Simple and Efficient Neural Architecture for Question Answering
https://www.semanticscholar.org/paper/FastQA-A-Simple-and-Efficient-Neural-Architecture-Weissenborn-Wiese/7c1576b96a1e24...
12 May 2017 • 9mins
Ranked #11
26 - Structured Attention Networks, with Yoon Kim
ICLR 2017 paper, by Yoon Kim, Carl Denton, Luong Hoang, and Sasha Rush. Yoon comes on to talk with us about his paper. T...
30 Jun 2017 • 25mins
Ranked #12
04 - Recurrent Neural Network Grammars, with Chris Dyer
An interview with Chris Dyer. https://www.semanticscholar.org/paper/Recurrent-Neural-Network-Grammars-Dyer-Kuncoro/1594d9...
12 May 2017 • 24mins
Ranked #13
40 - On the State of the Art of Evaluation in Neural Language Models, with Gábor Melis
Recent arXiv paper by Gábor Melis, Chris Dyer, and Phil Blunsom. Gábor comes on the podcast to tell us about his work. H...
7 Nov 2017 • 29mins
Ranked #14
22 - Deep Multitask Learning for Semantic Dependency Parsing, with Noah Smith
An interview with Noah Smith. Noah tells us about his work with his students Hao Peng and Sam Thomson. We talk about wha...
16 Jun 2017 • 31mins
Ranked #15
11 - Relation Extraction with Matrix Factorization and Universal Schemas
https://www.semanticscholar.org/paper/Relation-Extraction-with-Matrix-Factorization-and-Riedel-Yao/52b5eab895a2d9ae24ea7...
29 May 2017 • 15mins
Ranked #16
20 - A simple neural network module for relational reasoning
The recently-hyped paper that got "superhuman" performance on FAIR's CLEVR dataset. https://arxiv.org/abs/1706.01427
14 Jun 2017 • 17mins
Ranked #17
99 - Evaluating Protein Transfer Learning, With Roshan Rao And Neil Thomas
For this episode, we chatted with Neil Thomas and Roshan Rao about modeling protein sequences and evaluating transfer le...
16 Dec 2019 • 44mins
Ranked #18
95 - Common sense reasoning, with Yejin Choi
In this episode, we invite Yejin Choi to talk about common sense knowledge and reasoning, a growing area in NLP. We sta...
7 Oct 2019 • 35mins
Ranked #19
94 - Decompositional Semantics, with Aaron White
In this episode, Aaron White tells us about the decompositional semantics initiative (Decomp), an attempt to re-think th...
30 Sep 2019 • 27mins
Ranked #20
96 - Question Answering as an Annotation Format, with Luke Zettlemoyer
In this episode, we chat with Luke Zettlemoyer about Question Answering as a format for crowdsourcing annotations of var...
12 Nov 2019 • 29mins