MARC details
000 - LEADER
Fixed length control field: 05630nam a2200241Ia 4500
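The leader is a fixed 24-character structure whose positions carry defined meanings in the MARC 21 bibliographic specification. A minimal sketch decoding the leader shown above (position meanings follow the MARC 21 standard; only the commonly used positions are decoded):

from typing import Dict

leader = "05630nam a2200241Ia 4500"

decoded: Dict[str, str] = {
    "record length":        leader[0:5],    # '05630' octets
    "record status":        leader[5],      # 'n' = new record
    "type of record":       leader[6],      # 'a' = language material
    "bibliographic level":  leader[7],      # 'm' = monograph/item
    "character coding":     leader[9],      # 'a' = UCS/Unicode
    "base address of data": leader[12:17],  # '00241'
    "encoding level":       leader[17],     # 'I' = full level (OCLC-defined)
    "cataloging form":      leader[18],     # 'a' = AACR2
}

for name, value in decoded.items():
    print(f"{name}: {value}")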
003 - CONTROL NUMBER IDENTIFIER
Control field: NULRC
005 - DATE AND TIME OF LATEST TRANSACTION
Control field: 20250520102828.0
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
Fixed length control field: 250520s9999 xx 000 0 und d
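Like the leader, the 008 is positionally encoded: a fixed 40-character field whose positions for books include date entered (00-05), type of date (06), and dates (07-14). A minimal sketch decoding the leading positions of the value above; note that the catalog display may collapse runs of blanks, so only the unambiguous leading positions are decoded here:

field_008 = "250520s9999 xx 000 0 und d"

date_entered = field_008[0:6]  # yymmdd the record entered the file
type_of_date = field_008[6]    # 's' = single known/probable date
date_1 = field_008[7:11]       # '9999' = date not yet determined

print(f"entered on file: 20{date_entered[:2]}-{date_entered[2:4]}-{date_entered[4:6]}")
print(f"type of date:    {type_of_date}")
print(f"date 1:          {date_1}")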
020 ## - INTERNATIONAL STANDARD BOOK NUMBER
International Standard Book Number: 9789814522731
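The ISBN-13 carries its own integrity check: digits are weighted alternately 1 and 3, and the weighted sum must be divisible by 10. A minimal sketch validating the ISBN in field 020:

def isbn13_is_valid(isbn: str) -> bool:
    """Check the ISBN-13 check digit (alternating 1/3 weights, sum mod 10 == 0)."""
    digits = [int(c) for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    total = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits))
    return total % 10 == 0

print(isbn13_is_valid("9789814522731"))  # True for this record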
040 ## - CATALOGING SOURCE
Transcribing agency: NULRC
050 ## - LIBRARY OF CONGRESS CALL NUMBER
Classification number: QA 76.87 .G73 2013
100 ## - MAIN ENTRY--PERSONAL NAME
Personal name: Graupe, Daniel
Relator term: author
245 #0 - TITLE STATEMENT
Title: Principles of artificial neural networks /
Statement of responsibility, etc.: Daniel Graupe.
250 ## - EDITION STATEMENT
Edition statement: 3rd edition
260 ## - PUBLICATION, DISTRIBUTION, ETC.
Place of publication, distribution, etc.: [New Jersey] :
Name of publisher, distributor, etc.: World Scientific Publishing Company,
Date of publication, distribution, etc.: c2013
300 ## - PHYSICAL DESCRIPTION
Extent: xviii, 363 pages :
Other physical details: illustrations ;
Dimensions: 26 cm.
365 ## - TRADE PRICE
Price amount: USD170.97
504 ## - BIBLIOGRAPHY, ETC. NOTE
Bibliography, etc. note: Includes bibliographical references (pages 349-356) and index.
505 ## - FORMATTED CONTENTS NOTE
Formatted contents note:
Ch. 1. Introduction and role of artificial neural networks
Ch. 2. Fundamentals of biological neural networks
Ch. 3. Basic principles of ANNs and their early structures. 3.1. Basic principles of ANN design. 3.2. Basic network structures. 3.3. The Perceptron's input-output principles. 3.4. The Adaline (ALC)
Ch. 4. The Perceptron. 4.1. The basic structure. 4.2. The single-layer representation problem. 4.3. The limitations of the single-layer Perceptron. 4.4. Many-layer Perceptrons. 4.A. Perceptron case study: identifying autoregressive parameters of a signal (AR time series identification)
Ch. 5. The Madaline. 5.1. Madaline training. 5.A. Madaline case study: character recognition
Ch. 6. Back propagation. 6.1. The back propagation learning procedure. 6.2. Derivation of the BP algorithm. 6.3. Modified BP algorithms. 6.A. Back propagation case study: character recognition. 6.B. Back propagation case study: the exclusive-OR (XOR) problem (2-layer BP). 6.C. Back propagation case study: the XOR problem (3-layer BP network). 6.D. Average monthly high and low temperature prediction using back propagation neural networks
Ch. 7. Hopfield networks. 7.1. Introduction. 7.2. Binary Hopfield networks. 7.3. Setting of weights in Hopfield nets: bidirectional associative memory (BAM) principle. 7.4. Walsh functions. 7.5. Network stability. 7.6. Summary of the procedure for implementing the Hopfield network. 7.7. Continuous Hopfield models. 7.8. The continuous energy (Lyapunov) function. 7.A. Hopfield network case study: character recognition. 7.B. Hopfield network case study: traveling salesman problem. 7.C. Cell shape detection using neural networks
Ch. 8. Counter propagation. 8.1. Introduction. 8.2. Kohonen self-organizing map (SOM) layer. 8.3. Grossberg layer. 8.4. Training of the Kohonen layer. 8.5. Training of Grossberg layers. 8.6. The combined counter propagation network. 8.A. Counter propagation network case study: character recognition
Ch. 9. Large scale memory storage and retrieval (LAMSTAR) network. 9.1. Motivation. 9.2. Basic principles of the LAMSTAR neural network. 9.3. Detailed outline of the LAMSTAR network. 9.4. Forgetting feature. 9.5. Training vs. operational runs. 9.6. Operation in face of missing data. 9.7. Advanced data analysis capabilities. 9.8. Modified version: normalized weights. 9.9. Concluding comments and discussion of applicability. 9.A. LAMSTAR network case study: character recognition. 9.B. Application to medical diagnosis problems. 9.C. Predicting price movement in market microstructure via LAMSTAR. 9.D. Constellation recognition
Ch. 10. Adaptive resonance theory. 10.1. Motivation. 10.2. The ART network structure. 10.3. Setting-up of the ART network. 10.4. Network operation. 10.5. Properties of ART. 10.6. Discussion and general comments on ART-I and ART-II. 10.A. ART-I network case study: character recognition. 10.B. ART-I case study: speech recognition
Ch. 11. The cognitron and the neocognitron. 11.1. Background of the cognitron. 11.2. The basic principles of the cognitron. 11.3. Network operation. 11.4. Cognitron's network training. 11.5. The neocognitron
Ch. 12. Statistical training. 12.1. Fundamental philosophy. 12.2. Annealing methods. 12.3. Simulated annealing by Boltzmann training of weights. 12.4. Stochastic determination of magnitude of weight change. 12.5. Temperature-equivalent setting. 12.6. Cauchy training of neural network. 12.A. Statistical training case study: a stochastic Hopfield network for character recognition. 12.B. Statistical training case study: identifying AR signal parameters with a stochastic Perceptron model
Ch. 13. Recurrent (time cycling) back propagation networks. 13.1. Recurrent/discrete time networks. 13.2. Fully recurrent networks. 13.3. Continuously recurrent back propagation networks. 13.A. Recurrent back propagation case study: character recognition.
520 ## - SUMMARY, ETC.
Summary, etc.: Artificial neural networks are most suitable for solving problems that are complex, ill-defined, highly nonlinear, stochastic, or that involve many different variables. Such problems are abundant in medicine, finance, security, and beyond. This volume covers the basic theory and architecture of the major artificial neural networks. Uniquely, it presents 18 complete case studies of applications of neural networks in various fields, ranging from cell-shape classification to micro-trading in finance to constellation recognition, all with their respective source codes. These case studies demonstrate in detail how such applications are designed and executed and how their specific results are obtained. The book is written for a one-semester graduate or senior-level undergraduate course on artificial neural networks. It is also intended as a self-study and reference text for scientists, engineers, and researchers in medicine, finance, and data mining.
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name entry element: Neural networks (Computer science)
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Source of classification or shelving scheme: Library of Congress Classification
Koha item type: Books
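Records like this one are machine-readable, and the whole record can be processed programmatically rather than read field by field. A hedged sketch using the third-party pymarc library (pip install pymarc), assuming the record has been exported from Koha to a binary MARC file named record.mrc (a hypothetical filename for illustration):

from pymarc import MARCReader

with open("record.mrc", "rb") as fh:
    for record in MARCReader(fh):
        print(record.leader)                      # 05630nam a2200241Ia 4500
        for field in record.get_fields("245"):
            # $a = title, $c = statement of responsibility
            print(field.get_subfields("a", "c"))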