+DS vLE: The Transformer Network for Natural Language Processing


Thursday, September 10, 2020 - 4:30pm to 6:00pm

Virtual session



Lawrence Carin

Neural-network-based methods for natural language processing (NLP) constitute an area of significant recent technical progress, with many interesting real-world applications. The Transformer Network is one of the newest and most powerful approaches of this type. The algorithm is based on the repeated application of attention networks within an encoder-decoder framework. This presentation will describe the basics of all-attention models (the Transformer) for NLP, with applications in areas such as text synthesis (e.g., suggesting email text) and language translation. This session is part of the Duke+DataScience (+DS) program virtual learning experiences (vLEs). To learn more, please visit https://plus.datascience.duke.edu
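As background for the session, the attention operation at the heart of the Transformer can be sketched in a few lines. The following is a minimal NumPy illustration of scaled dot-product attention (softmax(QK^T / sqrt(d_k)) V), the building block the talk refers to; the array sizes and the self-attention usage below are illustrative assumptions, not material from the announcement.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core Transformer operation: softmax(Q K^T / sqrt(d_k)) V.

    Each output row is a weighted average of the value vectors V,
    with weights given by the similarity of a query to each key.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V

# Toy example (hypothetical sizes): 3 tokens, embedding dimension 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)          # self-attention: Q = K = V
print(out.shape)                                     # (3, 4)
```

In the full encoder-decoder architecture this operation is applied repeatedly, with learned linear projections producing distinct Q, K, and V matrices in each attention head.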