- 384 pages
- English
- PDF
About This Book
Neural Networks for Perception, Volume 2: Computation, Learning, and Architectures explores the computational and adaptation problems involved in the use of neural systems, along with the corresponding hardware architectures capable of implementing neural networks for perception and of coping with the complexity inherent in massively distributed computation. The book addresses both theoretical and practical issues bearing on the feasibility of explaining human perception, and of implementing machine perception, in terms of neural network models. The text is organized into two sections. The first, on computation and learning, covers learning visual behaviors, the elementary theory of the basic backpropagation architecture, and computation and learning in the context of neural network capacity. The second section is devoted to hardware architectures; its chapters describe the architectures and possible applications of recent neurocomputing models, including the Cohen-Grossberg model of associative memory, hybrid optical/digital architectures for neurocomputing, and electronic circuits for adaptive synapses. Neuroscientists, computer scientists, engineers, and researchers in artificial intelligence will find the book useful.
Table of contents
- Front Cover
- Computation, Learning, and Architectures
- Copyright Page
- Table of Contents
- Dedication
- Contents of Volume 1
- Contributors
- Foreword
- PART III: Computation and Learning
- PART IV: Architectures
- Index