
Deep Learning for Natural Language Processing LiveLessons (Video Training), 2nd Edition




Description

  • Copyright 2020
  • Edition: 2nd
  • Online Video
  • ISBN-10: 0-13-662004-3
  • ISBN-13: 978-0-13-662004-4

Nearly 5 Hours of Video Instruction

An intuitive introduction to processing natural language data with TensorFlow-Keras deep learning models.

Overview

Deep Learning for Natural Language Processing LiveLessons, Second Edition, is an introduction to building natural language models with deep learning. These lessons bring intuitive explanations of essential theory to life with interactive, hands-on Jupyter notebook demos. Examples feature Python and Keras, the high-level API for TensorFlow 2, the most popular Deep Learning library. Early lessons cover the specifics of working with natural language data, including how to convert natural language into numerical representations that can be readily processed by machine learning approaches. Later lessons leverage state-of-the-art Deep Learning architectures to make predictions with natural language data.
Skill Level

  • Intermediate

Learn How To
  • Preprocess natural language data for use in machine learning applications
  • Transform natural language into numerical representations with word2vec
  • Make predictions with Deep Learning models trained on natural language
  • Apply state-of-the-art NLP approaches with Keras, the high-level API for TensorFlow 2
  • Improve Deep Learning model performance by selecting appropriate model architectures and tuning model hyperparameters

Who Should Take This Course

These LiveLessons are perfectly suited to software engineers, data scientists, analysts, and statisticians with an interest in applying Deep Learning to natural language data. Code examples are provided in Python, so familiarity with it or another object-oriented programming language would be helpful.

Course Requirements

The author’s Deep Learning with TensorFlow, Keras, and PyTorch LiveLessons, or familiarity with the topics covered in Chapters 5 through 9 of his book Deep Learning Illustrated, are a prerequisite.

Lesson Descriptions

Lesson 1: The Power and Elegance of Deep Learning for NLP
This lesson starts off by examining Natural Language Processing and how it has been revolutionized in recent years by Deep Learning approaches. Next comes a review of how to run the code in these LiveLessons. This is followed by the foundational Deep Learning theory on which an NLP specialization can be built. Finally, the lesson provides you with a sneak peek at the capabilities you’ll develop over the course of all five lessons.

Lesson 2: Word Vectors
The lesson begins with a little linguistics section that introduces computational representations of natural language elements. Then it turns to illustrating what word vectors are as well as how the beautiful word2vec algorithm creates them.
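The "king − man + woman ≈ queen" style of vector arithmetic that word2vec makes possible can be sketched in a few lines of plain Python. The four 3-dimensional vectors below are invented purely to show the mechanics; real word2vec embeddings are learned from large corpora and typically have hundreds of dimensions.

```python
# Toy illustration of word-vector arithmetic with hand-made 3-d vectors.
# These vectors are fabricated for the demo, not learned by word2vec.
import math

vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.9, 0.1],
    "woman": [0.1, 0.2, 0.8],
}

def cosine(u, v):
    """Cosine similarity: the standard closeness measure for word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# The classic analogy: king - man + woman should land nearest to queen.
target = [k - m + w for k, m, w in
          zip(vectors["king"], vectors["man"], vectors["woman"])]
nearest = max(vectors, key=lambda word: cosine(vectors[word], target))
print(nearest)  # → queen
```

In practice you would use learned embeddings (e.g., via the gensim library or a Keras Embedding layer) rather than hand-built vectors, but the arithmetic is exactly this.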

Lesson 3: Modeling Natural Language Data
In the preceding lesson, you learned about vector-space embeddings and creating word vectors with word2vec. That process identified shortcomings of our natural language data, so this lesson begins with coverage of best practices for preprocessing language data. Next, on the whiteboard, Jon works through how to calculate a concise and broadly useful summary metric called the Area Under the Curve of the Receiver Operating Characteristic. You immediately learn how to calculate that summary metric in practice by building and evaluating a dense neural network for classifying documents. The lesson then goes a step further by showing you how to add convolutional layers into your deep neural network as well.
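The AUC of the ROC curve that Jon works through on the whiteboard can also be computed directly from its probabilistic definition: the probability that a randomly chosen positive example receives a higher model score than a randomly chosen negative one (with ties counting half). The labels and scores below are made up for the demo; this is a plain-Python sketch, not the course's Keras code.

```python
# AUC-ROC from its probabilistic definition, by counting positive/negative
# pairs. Quadratic in the number of examples, so for illustration only;
# in practice you would use a library routine such as
# sklearn.metrics.roc_auc_score.
def roc_auc(labels, scores):
    pairs = wins = 0.0
    for l_i, s_i in zip(labels, scores):
        if l_i != 1:                      # consider each positive example...
            continue
        for l_j, s_j in zip(labels, scores):
            if l_j != 0:                  # ...against each negative example
                continue
            pairs += 1
            if s_i > s_j:
                wins += 1                 # positive ranked above negative
            elif s_i == s_j:
                wins += 0.5               # ties count half
    return wins / pairs

labels = [1, 1, 0, 1, 0, 0]                  # 1 = positive, 0 = negative
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]      # illustrative predicted scores
auc = roc_auc(labels, scores)
print(auc)  # 9 positive/negative pairs, 8 ranked correctly → 8/9 ≈ 0.889
```

An AUC of 1.0 means the model ranks every positive above every negative; 0.5 is no better than chance.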

Lesson 4: Recurrent Neural Networks
This lesson kicks off by delving into the essential theory of Recurrent Neural Networks, a Deep Learning family that’s ideally suited to handling data that occur in a sequence, as natural language does. You immediately learn how to apply this theory by incorporating an RNN into your document classification model. Jon then provides a high-level theoretical overview of especially powerful RNN variants--the Long Short-Term Memory Unit and the Gated Recurrent Unit--before showing you how to incorporate these variants into your deep learning models as well.
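The essential recurrence at the heart of a simple RNN can be sketched with a single scalar unit in plain Python: h_t = tanh(w_x · x_t + w_h · h_{t−1} + b). The weights below are arbitrary illustrative values; Keras's SimpleRNN layer applies the same idea with weight matrices over whole vectors and learns the weights by backpropagation.

```python
# Minimal sketch of the core recurrence of a simple RNN, one scalar unit:
#   h_t = tanh(w_x * x_t + w_h * h_{t-1} + b)
# The weights are arbitrary constants chosen for illustration.
import math

def rnn_states(xs, w_x=0.5, w_h=0.8, b=0.0):
    h = 0.0                      # initial hidden state
    states = []
    for x in xs:                 # one step per sequence element
        h = math.tanh(w_x * x + w_h * h + b)
        states.append(h)
    return states

states = rnn_states([1.0, 0.0, 0.0])
# Although the second and third inputs are zero, their hidden states are
# nonzero: the first input echoes forward through w_h * h_{t-1}, which is
# how an RNN carries context along a sequence.
print(states)
```

LSTMs and GRUs replace this bare tanh recurrence with gated updates so that context can survive over much longer sequences.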

Lesson 5: Advanced Models
This lesson expands your natural language modeling capabilities further by examining special cases of the LSTM, namely the Bi-Directional and Stacked varieties. Jon also arms you with a rich set of natural language data sets that you can use to train powerful Deep Learning models. To wrap up these LiveLessons, Jon takes you on a journey through other advanced approaches, including sequence generation, seq2seq models, attention, transfer learning, non-sequential network architectures, and financial time series applications.
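What "Bi-Directional" means can be sketched conceptually in plain Python: run the same recurrence left-to-right and right-to-left, then pair the two state sequences so each position sees both past and future context. This is a toy scalar version of the idea with arbitrary weights; in Keras the same concept is provided by wrapping an LSTM in the Bidirectional layer wrapper.

```python
# Conceptual sketch of a bi-directional recurrent layer using a toy
# scalar recurrence (weights chosen arbitrarily for illustration).
import math

def scan(xs, w_x=0.5, w_h=0.8):
    """Run a simple tanh recurrence over the sequence, left to right."""
    h, states = 0.0, []
    for x in xs:
        h = math.tanh(w_x * x + w_h * h)
        states.append(h)
    return states

def bidirectional(xs):
    forward = scan(xs)                                   # past context
    backward = list(reversed(scan(list(reversed(xs)))))  # future context
    # Each timestep gets a (forward, backward) pair of states, analogous
    # to concatenating the two directions' hidden vectors.
    return list(zip(forward, backward))

out = bidirectional([1.0, 0.0, -1.0])
```

Stacking is the complementary trick: the full sequence of states from one recurrent layer becomes the input sequence to the next, letting higher layers model longer-range structure.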

About Pearson Video Training

Pearson publishes expert-led video tutorials covering a wide selection of technology topics designed to teach you the skills you need to succeed. These professional and personal technology videos feature world-leading author instructors published by your trusted technology brands: Addison-Wesley, Cisco Press, Pearson IT Certification, Prentice Hall, Sams, and Que. Topics include: IT Certification, Network Security, Cisco Technology, Programming, Web Development, Mobile Development, and more. Learn more about Pearson Video training at http://www.informit.com/video.

Video lessons are available for download for offline viewing within the streaming format. Look for the green arrow in each lesson.

Sample Content

Table of Contents

Introduction
Lesson 1: The Power and Elegance of Deep Learning for NLP
Topics
1.1 Introduction to Deep Learning for Natural Language Processing
1.2 Running the Hands-On Code Examples in Jupyter Notebooks
1.3 Review of Prerequisite Deep Learning Theory
1.4 A Sneak Peek

Lesson 2: Word Vectors
Topics
2.1 Computational Representations of Natural Language Elements
2.2 Visualizing Word Vectors with word2viz
2.3 Localist Versus Distributed Representations
2.4 Elements of Natural Human Language
2.5 The word2vec Algorithm
2.6 Creating Word Vectors with word2vec
2.7 Pre-Trained Word Vectors and doc2vec

Lesson 3: Modeling Natural Language Data
Topics
3.1 Best Practices for Preprocessing Natural Language Data
3.2 The Area Under the ROC Curve
3.3 Document Classification with a Dense Neural Net
3.4 Classification with a Convolutional Neural Net

Lesson 4: Recurrent Neural Networks
Topics
4.1 Essential Theory of RNNs
4.2 RNNs in Practice
4.3 Essential Theory of LSTMs and GRUs
4.4 LSTMs and GRUs in Practice

Lesson 5: Advanced Models
Topics
5.1 Bi-Directional LSTMs
5.2 Stacked LSTMs
5.3 Datasets for NLP
5.4 Sequence Generation
5.5 seq2seq and Attention
5.6 Transfer Learning in NLP: BERT, ELMo, GPT-2 and Other Characters
5.7 Non-Sequential Architectures: The Keras Functional API
5.8 (Financial) Time Series Applications
Summary 
