Overview
- Explains the intricacy and diversity of recurrent networks, from simple to more complex gated recurrent neural networks
- Discusses the design framing of such networks and how to redesign the simple RNN to avoid unstable behavior
- Describes forms of RNN training framed as adaptive non-convex optimization with dynamics constraints
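The first two points above concern the simple RNN and its potential instability. As a minimal illustration (a sketch in numpy, not code from the book; all names and dimensions here are assumptions), the simple recurrence h_t = tanh(W h_{t-1} + U x_t + b) can exhibit exploding states or gradients when the recurrent weight matrix W has spectral radius above 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the book)
n_h, n_x = 4, 3

# Simple (Elman-style) RNN parameters
W = rng.normal(scale=0.5, size=(n_h, n_h))  # recurrent weights
U = rng.normal(scale=0.5, size=(n_h, n_x))  # input weights
b = np.zeros(n_h)

def rnn_step(h_prev, x):
    """One step of the simple RNN recurrence h_t = tanh(W h_{t-1} + U x_t + b)."""
    return np.tanh(W @ h_prev + U @ x + b)

# Rough stability heuristic: if the spectral radius of W exceeds 1, the
# linearized dynamics around h = 0 can diverge, one source of the unstable
# behavior the design discussion addresses.
spectral_radius = max(abs(np.linalg.eigvals(W)))

# Run the recurrence over a short random input sequence
h = np.zeros(n_h)
for t in range(5):
    h = rnn_step(h, rng.normal(size=n_x))

print(spectral_radius, h.shape)
```

The dynamics-constrained view in the third point treats exactly this recurrence as a constraint on the training optimization.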
About this book
This textbook offers a compact but comprehensive treatment of recurrent neural networks, developing analytical and design steps from scratch. It treats general recurrent neural networks with principled training methods that yield (generalized) backpropagation through time (BPTT). The author focuses on the basics and nuances of recurrent neural networks, giving a technical and principled treatment of the subject with a view toward implementation in deep learning computational frameworks, e.g., Python and TensorFlow-Keras. Recurrent neural networks are treated holistically, from simple to gated architectures, adopting the machinery of adaptive non-convex optimization with dynamic constraints to leverage its systematic power in organizing the learning and training processes. This permits a flow of concepts and techniques that provides grounded support for design and training choices. The author’s approach enables strategic co-training of output layers, using supervised learning, and hidden layers, using unsupervised learning, to generate more efficient internal representations and better accuracy. As a result, readers will be able to tailor proficient training procedures for recurrent neural networks in their target applications.
Table of contents (6 chapters)
- Part I
- Part II
- Gated Recurrent Neural Networks: The LSTM RNN
- Gated Recurrent Neural Networks: The GRU and The MGU RNN
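The gated chapters listed above cover the LSTM, GRU, and MGU cells. For orientation, here is a minimal numpy sketch of one common form of the GRU update (an illustration under standard notation, not code or notation taken from the book; biases omitted for brevity):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(h_prev, x, params):
    """One step of a standard GRU cell (one common convention):
        z_t = sigmoid(Wz h_{t-1} + Uz x_t)          # update gate
        r_t = sigmoid(Wr h_{t-1} + Ur x_t)          # reset gate
        g_t = tanh(Wg (r_t * h_{t-1}) + Ug x_t)     # candidate state
        h_t = (1 - z_t) * h_{t-1} + z_t * g_t
    """
    Wz, Uz, Wr, Ur, Wg, Ug = params
    z = sigmoid(Wz @ h_prev + Uz @ x)
    r = sigmoid(Wr @ h_prev + Ur @ x)
    g = np.tanh(Wg @ (r * h_prev) + Ug @ x)
    return (1.0 - z) * h_prev + z * g

rng = np.random.default_rng(1)
n_h, n_x = 4, 3  # illustrative dimensions
params = tuple(rng.normal(scale=0.5, size=s)
               for s in [(n_h, n_h), (n_h, n_x)] * 3)

# Run the gated recurrence over a short random input sequence
h = np.zeros(n_h)
for t in range(5):
    h = gru_step(h, rng.normal(size=n_x), params)
print(h.shape)
```

Since h_t is a convex combination of the bounded candidate g_t and the previous state, the gating mitigates the state blow-up that the simple RNN can suffer. The MGU treated in the same part is, roughly, a reduced variant that uses a single gate in place of the separate update and reset gates.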
About the author
The author was a Visiting Professor at UC Berkeley (1983), the California Institute of Technology, Pasadena (1992), and the University of Minnesota, Twin Cities (1993). He joined MSU in 1985 and has been a Professor since 1991. He has worked for and consulted with several companies, including General Motors, Ford, Smith’s Industries, Intersignal, IC Tech Inc., and Clarity LLC. He has authored more than 250 technical papers and co-edited the textbook Dynamical Systems Approaches to Nonlinear Problems in Circuits and Systems (SIAM, 1988). He is a co-inventor on more than 14 patents covering adaptive nonlinear signal processing, neural networks, and sensors.
Bibliographic Information
Book Title: Recurrent Neural Networks
Book Subtitle: From Simple to Gated Architectures
Authors: Fathi M. Salem
DOI: https://doi.org/10.1007/978-3-030-89929-5
Publisher: Springer Cham
eBook Packages: Engineering, Engineering (R0)
Copyright Information: The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2022
Hardcover ISBN: 978-3-030-89928-8 (published 04 January 2022)
Softcover ISBN: 978-3-030-89931-8 (published 05 January 2023)
eBook ISBN: 978-3-030-89929-5 (published 03 January 2022)
Edition Number: 1
Number of Pages: XX, 121
Number of Illustrations: 2 b/w illustrations, 24 illustrations in colour
Topics: Circuits and Systems, Signal, Image and Speech Processing, Data Mining and Knowledge Discovery