Overview
1 Introduction
  1.1 Deep Learning Background
  1.2 Structure-Sensitive Neural Networks
  1.3 The Proposed Tree-Based Convolutional Neural Networks
  1.4 Overview of the Book
2 Preliminaries and Related Work
  2.1 General Neural Networks
    2.1.1 Neurons and Multi-Layer Perceptrons
    2.1.2 Training of Neural Networks: Backpropagation
    2.1.3 Pros and Cons of Multi-Layer Perceptrons
    2.1.4 Pretraining of Neural Networks
  2.2 Neural Networks Applied in Natural Language Processing
    2.2.1 The Characteristics of Natural Language
    2.2.2 Language Models
    2.2.3 Word Embeddings
  2.3 Existing Structure-Sensitive Neural Networks
    2.3.1 Convolutional Neural Networks
    2.3.2 Recurrent Neural Networks
    2.3.3 Recursive Neural Networks
  2.4 Summary and Discussions
3 General Concepts of Tree-Based Convolutional Neural Networks (TBCNNs)
  3.1 Idea and Formulation
  3.2 Applications of TBCNNs
  3.3 Issues in Designing TBCNNs
4 TBCNN for Programs' Abstract Syntax Trees (ASTs)
  4.1 Background of Program Analysis
  4.2 Proposed Model
    4.2.1 Overview
    4.2.2 Representation Learning of AST Nodes
    4.2.3 Encoding Layer
    4.2.4 AST-Based Convolutional Layer
    4.2.5 Dynamic Pooling
    4.2.6 Continuous Binary Tree
  4.3 Experiments
    4.3.1 Unsupervised Representation Learning
    4.3.2 Program Classification
    4.3.3 Detecting Bubble Sort
    4.3.4 Model Analysis
  4.4 Summary and Discussions
5 TBCNN for Constituency Trees in Natural Language Processing
  5.1 Background of Sentence Modeling and Constituency Trees
  5.2 Proposed Model
    5.2.1 Constituency Trees as Input
    5.2.2 Recursively Representing Intermediate Layers
    5.2.3 Constituency Tree-Based Convolutional Layer
    5.2.4 Dynamic Pooling Layer
  5.3 …
Details
- ISBN-13: 9789811318696
- ISBN-10: 9811318697
- Publisher: Springer
- Publish Date: October 2018
- Dimensions: 9.21 x 6.14 x 0.24 inches
- Shipping Weight: 0.38 pounds
- Page Count: 96