Multi-Valued and Universal Binary Neurons: Theory, Learning, and Applications

Title: Multi-Valued and Universal Binary Neurons: Theory, Learning, and Applications
by Igor N. Aizenberg, Naum Nisonovich Aizenberg, J. Vandewalle
ISBN: 0-7923-7824-5
Publisher: Kluwer Academic Publishers
Pub. Date: 1 April 2000
Format: Hardcover
Volumes: 1
List Price(USD): $179.00

Average Customer Rating: 2 (1 review)

Customer Reviews

Rating: 2
Summary: A surprising generalization of the perceptron
Comment: The reviewed book is dedicated to an extension of the perceptron, which in its initial form can correctly classify only linearly separable patterns. Minsky and Papert suggested in their seminal book Perceptrons (published in 1969) that this serious shortcoming might be surmounted in two different ways: the first was the introduction of so-called higher-order input activities, represented by products of single input activities (e.g. x1·x2), while the second employed hidden neurons. Minsky and Papert rejected both of these simple and straightforward extensions of perceptron theory, mainly because no proper learning algorithm existed for them. The reviewed book extensively discusses another way to generalize the perceptron so that it can classify patterns that are not linearly separable. The idea is very simple: the authors postulate that the weight coefficients may be complex numbers and that the corresponding activation function is defined as follows:

f(z) = 1 if 0 <= arg(z) < Pi/2
f(z) = 0 if Pi/2 <= arg(z) < Pi

with the output values continuing to alternate over the remaining quadrants of the complex plane. It is then easy to demonstrate that the XOR function is realizable by this extension of the perceptron.

The whole book consists of different extensions of this simple idea that are able to realize more complicated Boolean functions (in particular, the so-called k-valued threshold functions). The notion of linear separability is extended to so-called P-realizable functions, so that multiple-valued threshold functions can be realized correctly. Moreover, it is demonstrated that incremental perceptron learning can be modified to adjust the complex weight coefficients, so that multiple-valued threshold functions are learned. At the end of the book, illustrative applications are presented that demonstrate the effectiveness of the proposed method (e.g. an associative memory for gray-scale image processing).

The book is written in a highly sophisticated style, employing mathematical concepts (e.g. group theory) that are unusual in the neural-network literature. What can we take from the book that is substantial enough to be included in neural-network lectures? The two extensions of the perceptron suggested by Minsky and Papert to overcome the linear-separability barrier can be complemented by a third possible extension based on complex weight coefficients. This is an interesting fact, but I would recommend the book only to a reader who already knows neural networks well, likes mathematics, and is specifically interested in perceptron learning.
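To make the construction concrete, here is a minimal sketch in Python (my own illustration, not code from the book): it assumes the common bipolar encoding of Boolean values (0 -> +1, 1 -> -1), an activation whose output alternates in sign over the four quadrants of the complex plane, and hand-picked illustrative weights w0 = 0, w1 = 1, w2 = i, which together realize XOR with a single neuron.

import cmath

def activation(z):
    """Output alternates over the four quadrants of the complex plane:
    +1 in quadrants 0 and 2, -1 in quadrants 1 and 3."""
    angle = cmath.phase(z) % (2 * cmath.pi)   # map arg(z) into [0, 2*pi)
    sector = int(angle // (cmath.pi / 2))     # quadrant index 0..3
    return 1 if sector % 2 == 0 else -1

def encode(bit):
    # Bipolar encoding of Boolean values: 0 -> +1, 1 -> -1
    return 1 - 2 * bit

# Hand-picked illustrative weights: z = w0 + w1*x1 + w2*x2
w0, w1, w2 = 0, 1, 1j

for a in (0, 1):
    for b in (0, 1):
        z = w0 + w1 * encode(a) + w2 * encode(b)
        out = 0 if activation(z) == 1 else 1  # decode back to Boolean
        print(f"{a} XOR {b} = {out}")

Running this prints the full XOR truth table. No single perceptron with real weights and a threshold activation can do this; letting the weights be complex and making the activation depend on the argument of the weighted sum is exactly what supplies the extra expressive power the book builds on.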
