Continuous optimization is a ubiquitous formulation for an impressive number of different problems in science and engineering. In this chapter, a unified framework for problem solving is proposed in the continuum setting, based on the notion of action: a sort of continuous algorithm running on an abstract machine, referred to as the deterministic terminal attractor machine (DTAM), loosely related to its discrete computational counterparts. A number of examples illustrate how continuous algorithms can be devised. The proposed general computational scheme incorporates the most interesting supervised and unsupervised learning schemes in artificial neural networks, as well as the problem-solving approach based on Hopfield networks. Finally, a general discussion on computational complexity issues indicates some intriguing links between the presence of local minima in the error surface of the energy function and the complexity of the solution.
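The chapter itself is not reproduced here, but the "terminal attractor" idea behind the DTAM name can be sketched numerically: whereas an ordinary gradient flow ẋ = −x only approaches its equilibrium asymptotically, a terminal-attractor flow such as ẋ = −x^(1/3) reaches it in finite time. The snippet below is an illustrative sketch, not code from the chapter; the function names and step sizes are assumptions chosen for the demonstration.

```python
import math

def cbrt(x):
    # Signed cube root, so the dynamics are defined for negative x too.
    return math.copysign(abs(x) ** (1.0 / 3.0), x)

def integrate(rhs, x0, dt=1e-3, t_end=2.0):
    # Forward-Euler integration of dx/dt = rhs(x) from x0 up to t_end.
    x, t = x0, 0.0
    while t < t_end:
        x = x + dt * rhs(x)
        t += dt
    return x

# Terminal attractor: dx/dt = -x^(1/3) reaches 0 at t* = (3/2) x0^(2/3),
# i.e. at t* = 1.5 for x0 = 1, so by t = 2 it has essentially arrived.
x_terminal = integrate(lambda x: -cbrt(x), 1.0)

# Ordinary gradient flow: dx/dt = -x decays like exp(-t) and at t = 2
# is still around 0.135, never exactly zero.
x_ordinary = integrate(lambda x: -x, 1.0)

print(x_terminal, x_ordinary)
```

The non-Lipschitz right-hand side at the origin is what produces finite-time convergence, which is why such dynamics are a natural candidate for a continuous notion of a halting computation.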
|Title:||Continuous problem solving and computational suspiciousness|
|Citation:||Gori, M. (2003). Continuous problem solving and computational suspiciousness. In Limitations and Future Trends in Neural Computation (pp. 1-22). Amsterdam: IOS Publishing.|
|Appears in collections:||2.1 Book contribution (Chapter or Essay)|