Neural Network Learning: Theoretical Foundations (Paperback)
Martin Anthony, Peter L. Bartlett
- Publisher: Cambridge
- Publication date: 2009-08-20
- List price: $1,980
- VIP price: 5% off, $1,881
- Language: English
- Pages: 404
- Binding: Paperback
- ISBN: 052111862X
- ISBN-13: 9780521118620
In stock, ships immediately (fewer than 3 copies left)
Customers who bought this item also bought...
- Python Power!: The Comprehensive Guide ($299)
- Interactive Data Visualization for the Web (Paperback) ($840)
- An Introduction to Statistical Learning: With Applications in R (Hardcover) ($1,680)
- Xcode 實戰:Apple 平臺開發實用技術、技巧及最佳流程 ($505)
- Invent Your Own Computer Games with Python, 4/e (Paperback) ($891)
- Think Like a Data Scientist: Tackle the data science process step-by-step ($1,332)
- 機器人系統設計與製作 : Python 語言實現 ($301)
- 機器人操作系統ROS原理與應用 ($301)
- ROS 進階實例 ($500)
- ROS 機器人開發實踐 ($505)
Description
This important work describes recent theoretical advances in the study of artificial neural networks. It explores probabilistic models of supervised learning problems, and addresses the key statistical and computational questions. Chapters survey research on pattern classification with binary-output networks, including a discussion of the relevance of the Vapnik-Chervonenkis dimension, and of estimates of the dimension for several neural network models. In addition, Anthony and Bartlett develop a model of classification by real-output networks, and demonstrate the usefulness of classification with a "large margin." The authors explain the role of scale-sensitive versions of the Vapnik-Chervonenkis dimension in large margin classification, and in real-valued prediction. Key chapters also discuss the computational complexity of neural network learning, describing a variety of hardness results, and outlining two efficient, constructive learning algorithms. The book is self-contained and accessible to researchers and graduate students in computer science, engineering, and mathematics.
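The description leans on the Vapnik-Chervonenkis dimension. As a minimal illustrative sketch of the "shattering" idea behind it (not taken from the book), the following brute-force check shows that threshold classifiers on the real line shatter any single point but no pair of points, so their VC dimension is 1:

```python
def achievable_labelings(points, thresholds):
    """Labelings of `points` realizable by threshold classifiers h_t(x) = 1[x >= t]."""
    return {tuple(int(x >= t) for x in points) for t in thresholds}

def is_shattered(points, thresholds):
    """A point set is shattered if all 2^n binary labelings are realizable."""
    return len(achievable_labelings(points, thresholds)) == 2 ** len(points)

# A single point is shattered (both labels are realizable), so the VC dimension
# of thresholds is at least 1.
print(is_shattered([0.5], [-1.0, 1.0]))            # True
# No pair of points is shattered: the labeling (1, 0), with the smaller point
# positive, is never realizable, so the VC dimension is exactly 1.
print(is_shattered([0.3, 0.7], [-1.0, 0.5, 1.0]))  # False
```

For richer hypothesis classes such as the neural networks studied in the book, the same definition applies, but the dimension must be bounded analytically rather than enumerated.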