145 Episodes

  1. 24 - Improving Hypernymy Detection with an Integrated Path-based and Distributional Method

    Published: 27-6-2017
  2. 23 - Get To The Point: Summarization with Pointer-Generator Networks

    Published: 26-6-2017
  3. 22 - Deep Multitask Learning for Semantic Dependency Parsing, with Noah Smith

    Published: 16-6-2017
  4. 21 - Contextual Explanation Networks, with Maruan Al-Shedivat

    Published: 15-6-2017
  5. 20 - A simple neural network module for relational reasoning

    Published: 14-6-2017
  6. 19 - End-to-end Differentiable Proving, with Tim Rocktäschel

    Published: 12-6-2017
  7. 18 - Generalizing to Unseen Entities and Entity Pairs with Row-less Universal Schema

    Published: 9-6-2017
  8. 17 - pix2code: Generating Code from a Graphical User Interface Screenshot

    Published: 8-6-2017
  9. 16 - Arc-swift: A Novel Transition System for Dependency Parsing

    Published: 7-6-2017
  10. 15 - Attention and Augmented Recurrent Neural Networks

    Published: 6-6-2017
  11. 14 - Discourse-Based Objectives for Fast Unsupervised Sentence Representation Learning

    Published: 5-6-2017
  12. 13 - Question Answering from Unstructured Text by Retrieval and Comprehension

    Published: 2-6-2017
  13. 12 - Supervised Learning of Universal Sentence Representations from Natural Language Inference Data

    Published: 1-6-2017
  14. 11 - Relation Extraction with Matrix Factorization and Universal Schemas

    Published: 29-5-2017
  15. 10 - A Syntactic Neural Model for General-Purpose Code Generation

    Published: 26-5-2017
  16. 09 - Learning to Generate Reviews and Discovering Sentiment

    Published: 25-5-2017
  17. 08 - Finding News Citations for Wikipedia

    Published: 24-5-2017
  18. 07 - Capturing Semantic Similarity for Entity Linking with Convolutional Neural Networks

    Published: 23-5-2017
  19. 06 - Design Challenges for Entity Linking

    Published: 22-5-2017
  20. 05 - Transition-Based Dependency Parsing with Stack Long Short-Term Memory

    Published: 19-5-2017


**The podcast is currently on hiatus. For more active NLP content, check out the Holistic Intelligence Podcast linked below.** Welcome to the NLP Highlights podcast, where we invite researchers to talk about their work in various areas of natural language processing. All views expressed are those of the hosts and guests and do not represent their employers.
