Information-Theoretic Aspects of Neural Networks
Author:
Rating: 4.97 (774 votes)
ASIN: 0849331986
Format: Paperback
Pages: 416
Publish Date: 2016-01-10
Language: English
DESCRIPTION:
Information theoretics vis-a-vis neural networks generally embodies parametric entities and conceptual bases pertinent to memory considerations and information storage, information-theoretic cost functions, and neurocybernetics and self-organization. Existing studies only sparsely cover the entropy and/or cybernetic aspects of neural information. Information-Theoretic Aspects of Neural Networks cohesively explores this burgeoning discipline, covering topics such as:
- Shannon information and information dynamics
- neural complexity as an information-processing system
- memory and information storage in the interconnected neural web
- extremum (maximum and minimum) information entropy
- neural network training
- non-conventional, statistical distance measures for neural network optimization
- symmetric and asymmetric characteristics of information-theoretic error metrics
- algorithmic c
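To make two of these quantities concrete, the following is a minimal illustrative Python sketch, not code from the book: Shannon entropy, the asymmetric Kullback-Leibler relative entropy used as a statistical distance measure, and its symmetrized J-divergence counterpart (the asymmetric/symmetric distinction the last topics allude to). The function names and example distributions here are my own.

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy H(p) = -sum_i p_i log p_i (in nats)."""
    p = np.asarray(p, dtype=float)
    return float(-np.sum(p * np.log(p + eps)))

def kl_divergence(p, q, eps=1e-12):
    """Relative entropy D(p||q) = sum_i p_i log(p_i/q_i); asymmetric in p, q."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def j_divergence(p, q):
    """Symmetrized distance J(p,q) = D(p||q) + D(q||p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

# Example: a teacher distribution versus a network's output distribution.
teacher = np.array([0.7, 0.2, 0.1])
output = np.array([0.5, 0.3, 0.2])
print(shannon_entropy(teacher))        # ~0.802 nats
print(kl_divergence(teacher, output))  # != kl_divergence(output, teacher)
print(j_divergence(teacher, output))   # symmetric in its arguments
```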
REVIEW:
"A good book dealing with stochastic neural networks" (Karthik Sundaram). This book deals in depth with the various entropy error measures that can be used for neural networks. Minimum-entropy error measures and maximum-entropy error functions, such as the Kapur and Sharma-Mittal measures, are discussed as alternatives to the classical squared-error method when the input and teacher values are stochastic variables. The book also has a good introduction and is well written, and it includes experimental data to support the claims made by the author.
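For readers unfamiliar with the generalized entropies the reviewer names, here is an illustrative Python sketch, not the book's code, of the standard two-parameter Kapur and Sharma-Mittal entropies as they appear in the information-theory literature, alongside the classical squared-error cost they are contrasted with. The parameter choices and example vectors are arbitrary, and how the book assembles its actual cost functions from these entropies is not reproduced here.

```python
import numpy as np

def squared_error(output, teacher):
    """Classical squared-error cost between network output and teacher values."""
    output = np.asarray(output, dtype=float)
    teacher = np.asarray(teacher, dtype=float)
    return float(np.sum((output - teacher) ** 2))

def kapur_entropy(p, alpha, beta, eps=1e-12):
    """Kapur entropy of order alpha and type beta (alpha != beta):
    H = ln(sum p_i^alpha / sum p_i^beta) / (beta - alpha)."""
    p = np.asarray(p, dtype=float) + eps
    return float(np.log(np.sum(p ** alpha) / np.sum(p ** beta)) / (beta - alpha))

def sharma_mittal_entropy(p, alpha, beta, eps=1e-12):
    """Sharma-Mittal entropy (alpha != 1, beta != 1):
    H = ((sum p_i^alpha)^((1-beta)/(1-alpha)) - 1) / (1 - beta)."""
    p = np.asarray(p, dtype=float) + eps
    s = np.sum(p ** alpha)
    return float((s ** ((1.0 - beta) / (1.0 - alpha)) - 1.0) / (1.0 - beta))

# Example: teacher and output treated as probability vectors (stochastic targets).
teacher = np.array([0.7, 0.2, 0.1])
output = np.array([0.5, 0.3, 0.2])
print(squared_error(output, teacher))            # 0.06
print(kapur_entropy(output, 0.5, 2.0))           # generalized entropy of the output
print(sharma_mittal_entropy(output, 0.5, 2.0))
```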