Vol. 50(64), No. 2, June 2005

Establishing the Information Machine Concept
Marius Crisan
Department of Computer and Software Engineering, University "Politehnica" of Timisoara, 1900 Timisoara, Romania, phone: (+40) 256-403256, e-mail: marius.crisan@cs.upt.ro, web: http://www.cs.utt.ro/~crisan/


Keywords: entropy, algorithmic randomness, information processing, cognitive modeling.

Abstract
This paper synthesizes the author's previous work, which led to the introduction of a new concept of information machine: a classical term redefined in the wider context of advanced information processing. Computers can be programmed to solve problems and make decisions, but their performance remains unsatisfactory when facing the full range of human thought and talent. Efficient human-like thinking and linguistic behavior are difficult to produce in a computer. Yet there is no clear evidence that the formation of human thought relies on any mode of handling information that could not, in principle, be built into a machine. In this context, establishing the concept of information machine may shed some light on the issue and prove useful in understanding it.

References
[1] Crisan, M., "About Various Definitions of Entropy and Their Implications in Cognitive Sciences," Sci. & Tech. Bulletin of University "Politehnica" of Timisoara, 40(54), Transactions on Automatic Control and Computer Science, 1995.
[2] Crisan, M., "On Shannon, Fisher, and Algorithmic Entropy in Cognitive Systems," SACI-2004.
[3] Szilard, L., "On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings," Z. Phys., 53, 840-856, 1929. English translation by A. Rapaport and M. Knoller, reprinted in Quantum Theory and Measurement, J.A. Wheeler and W.H. Zurek (eds.), Princeton University Press, Princeton, 1983.
[4] Shannon, C.E., Weaver, W., "The Mathematical Theory of Communication," University of Illinois Press, Urbana, 1949.
[5] Chaitin, G.J., "Algorithmic Information Theory," Cambridge University Press, Cambridge, 1987.
[6] Yockey, H., "Information Theory and Molecular Biology," Cambridge University Press, 1992.
[7] Crisan, M., "Physical Entropy and Self-Organizing System Modeling," Proceedings of the International Conference on Cognitive Systems - ICCS'97, New Delhi, December 1997.
[8] Machta, J., "Entropy, Information, and Computation," Am. J. Phys., 67(12), December 1999.
[9] Adami, C., Cerf, N.J., "Complexity, Computation, and Measurement," in T. Toffoli, M. Biafore, and J. Leao (eds.), PhysComp96, New England Complex Systems Institute, 1996, pp. 7-11.
[10] Crisan, M., "Towards a Broader Information Concept Applied in Cognitive Modeling," Sci. & Tech. Bulletin of University "Politehnica" of Timisoara, 47(61), Transactions on Automatic Control and Computer Science, 2002.
[11] Crutchfield, J.P., "Observing Complexity and the Complexity of Observation," Proc. of the International Workshop on Endo/Exo-Problems in Physics, Ringberg Castle, Tegernsee, 1993.
[12] Crisan, M., "Thermodynamics of Information," Sci. & Tech. Bulletin of University "Politehnica" of Timisoara, 44(58), Transactions on Automatic Control and Computer Science, 1999.
[13] Frieden, B.R., "Physics from Fisher Information - A Unification," Cambridge University Press, 1998.
[14] Li, M., Vitanyi, P., "An Introduction to Kolmogorov Complexity and Its Applications," Springer-Verlag, New York, 1993.
[15] Crisan, M., "Algorithmic Randomness and Self-Organizing System Modeling," Sci. & Tech. Bulletin of University "Politehnica" of Timisoara, 42(56), Transactions on Automatic Control and Computer Science, 1997.