Ebook: Neural Networks Theory
- Tags: Applied Mathematics / Computational Methods of Engineering, Artificial Intelligence (incl. Robotics), Complexity
- Year: 2007
- Publisher: Springer-Verlag Berlin Heidelberg
- Edition: 1
- Language: English
- Format: PDF
"Neural Networks Theory is a major contribution to the neural networks literature. It is a treasure trove that should be mined by the thousands of researchers and practitioners worldwide who have not previously had access to the fruits of Soviet and Russian neural network research. Dr. Galushkin is to be congratulated and thanked for his completion of this monumental work; a book that only he could write. It is a major gift to the world."
Robert Hecht-Nielsen, Computational Neurobiology, University of California, San Diego
"Professor Galushkin’s monograph has many unique features that in totality make his work an important contribution to the literature of neural networks theory. He and his publisher deserve profuse thanks and congratulations from all who are seriously interested in the foundations of neural networks theory, its evolution and current status."
Lotfi Zadeh, University of California, Berkeley, Founder of Fuzzy Logic
"Professor Galushkin, a leader in neural networks theory in Russia, uses mathematical methods in combination with complexity theory, nonlinear dynamics and optimization, concepts that are solidly grounded in Russian tradition. His theory is expansive: covering not just the traditional topics such as network architecture, it also addresses neural continua in function spaces. I am pleased to see his theory presented in its entirety here, for the first time for many, so that the both theory he developed and the approach he took to understand such complex phenomena can be fully appreciated."
Shun-Ichi Amari, Director, RIKEN Brain Science Institute
"Neural Networks Theory is a major contribution to the neural networks literature. It is a treasure trove that should be mined by the thousands of researchers and practitioners worldwide who have not previously had access to the fruits of Soviet and Russian neural network research. Dr. Galushkin is to be congratulated and thanked for his completion of this monumental work; a book that only he could write. It is a major gift to the world."
Robert Hecht Nielsen, Computational Neurobiology, University of California, San Diego
"Professor Galushkin’s monograph has many unique features that in totality make his work an important contribution to the literature of neural networks theory. He and his publisher deserve profuse thanks and congratulations from all who are seriously interested in the foundations of neural networks theory, its evolution and current status."
Lotfi Zadeh, Berkeley, Founder of Fuzziness
"Professor Galushkin, a leader in neural networks theory in Russia, uses mathematical methods in combination with complexity theory, nonlinear dynamics and optimization, concepts that are solidly grounded in Russian tradition. His theory is expansive: covering not just the traditional topics such as network architecture, it also addresses neural continua in function spaces. I am pleased to see his theory presented in its entirety here, for the first time for many, so that the both theory he developed and the approach he took to understand such complex phenomena can be fully appreciated."
Sun-Ichi Amari, Director of RIKEN Brain Science Institute RIKEN
"Neural Networks Theory is a major contribution to the neural networks literature. It is a treasure trove that should be mined by the thousands of researchers and practitioners worldwide who have not previously had access to the fruits of Soviet and Russian neural network research. Dr. Galushkin is to be congratulated and thanked for his completion of this monumental work; a book that only he could write. It is a major gift to the world."
Robert Hecht Nielsen, Computational Neurobiology, University of California, San Diego
"Professor Galushkin’s monograph has many unique features that in totality make his work an important contribution to the literature of neural networks theory. He and his publisher deserve profuse thanks and congratulations from all who are seriously interested in the foundations of neural networks theory, its evolution and current status."
Lotfi Zadeh, Berkeley, Founder of Fuzziness
"Professor Galushkin, a leader in neural networks theory in Russia, uses mathematical methods in combination with complexity theory, nonlinear dynamics and optimization, concepts that are solidly grounded in Russian tradition. His theory is expansive: covering not just the traditional topics such as network architecture, it also addresses neural continua in function spaces. I am pleased to see his theory presented in its entirety here, for the first time for many, so that the both theory he developed and the approach he took to understand such complex phenomena can be fully appreciated."
Sun-Ichi Amari, Director of RIKEN Brain Science Institute RIKEN
Content:
Front Matter....Pages I-XX
Introduction....Pages 1-32
Front Matter....Pages 33-34
Transfer from the Logical Basis of Boolean Elements “AND, OR, NOT” to the Threshold Logical Basis....Pages 35-41
Qualitative Characteristics of Neural Network Architectures....Pages 43-52
Optimization of Cross Connection Multilayer Neural Network Structure....Pages 53-66
Continual Neural Networks....Pages 67-74
Front Matter....Pages 75-75
Investigation of Neural Network Input Signal Characteristics....Pages 77-87
Design of Neural Network Optimal Models....Pages 89-119
Analysis of the Open-Loop Neural Networks....Pages 121-141
Development of Multivariable Function Extremum Search Algorithms....Pages 143-159
Front Matter....Pages 161-161
Neural Network Adjustment Algorithms....Pages 163-187
Adjustment of Continuum Neural Networks....Pages 189-206
Selection of Initial Conditions During Neural Network Adjustment — Typical Neural Network Input Signals....Pages 207-222
Analysis of Closed-Loop Multilayer Neural Networks....Pages 223-272
Synthesis of Multilayer Neural Networks with Flexible Structure....Pages 273-294
Informative Feature Selection in Multilayer Neural Networks....Pages 295-302
Front Matter....Pages 303-303
Neural Network Reliability....Pages 305-319
Neural Network Diagnostics....Pages 321-338
Front Matter....Pages 339-339
Methods of Problem Solving in the Neural Network Logical Basis....Pages 341-376
Back Matter....Pages 377-396
"Neural Networks Theory is a major contribution to the neural networks literature. It is a treasure trove that should be mined by the thousands of researchers and practitioners worldwide who have not previously had access to the fruits of Soviet and Russian neural network research. Dr. Galushkin is to be congratulated and thanked for his completion of this monumental work; a book that only he could write. It is a major gift to the world."
Robert Hecht Nielsen, Computational Neurobiology, University of California, San Diego
"Professor Galushkin’s monograph has many unique features that in totality make his work an important contribution to the literature of neural networks theory. He and his publisher deserve profuse thanks and congratulations from all who are seriously interested in the foundations of neural networks theory, its evolution and current status."
Lotfi Zadeh, Berkeley, Founder of Fuzziness
"Professor Galushkin, a leader in neural networks theory in Russia, uses mathematical methods in combination with complexity theory, nonlinear dynamics and optimization, concepts that are solidly grounded in Russian tradition. His theory is expansive: covering not just the traditional topics such as network architecture, it also addresses neural continua in function spaces. I am pleased to see his theory presented in its entirety here, for the first time for many, so that the both theory he developed and the approach he took to understand such complex phenomena can be fully appreciated."
Sun-Ichi Amari, Director of RIKEN Brain Science Institute RIKEN
Content:
Front Matter....Pages I-XX
Introduction....Pages 1-32
Front Matter....Pages 33-34
Transfer from the Logical Basis of Boolean Elements “AND, OR, NOT” to the Threshold Logical Basis....Pages 35-41
Qualitative Characteristics of Neural Network Architectures....Pages 43-52
Optimization of Cross Connection Multilayer Neural Network Structure....Pages 53-66
Continual Neural Networks....Pages 67-74
Front Matter....Pages 75-75
Investigation of Neural Network Input Signal Characteristics....Pages 77-87
Design of Neural Network Optimal Models....Pages 89-119
Analysis of the Open-Loop Neural Networks....Pages 121-141
Development of Multivariable Function Extremum Search Algorithms....Pages 143-159
Front Matter....Pages 161-161
Neural Network Adjustment Algorithms....Pages 163-187
Adjustment of Continuum Neural Networks....Pages 189-206
Selection of Initial Conditions During Neural Network Adjustment — Typical Neural Network Input Signals....Pages 207-222
Analysis of Closed-Loop Multilayer Neural Networks....Pages 223-272
Synthesis of Multilayer Neural Networks with Flexible Structure....Pages 273-294
Informative Feature Selection in Multilayer Neural Networks....Pages 295-302
Front Matter....Pages 303-303
Neural Network Reliability....Pages 305-319
Neural Network Diagnostics....Pages 321-338
Front Matter....Pages 339-339
Methods of Problem Solving in the Neural Network Logical Basis....Pages 341-376
Back Matter....Pages 377-396
....