Self-Organizing Information for Approaching AI by Relative Entropy
© 2021 by IJCTT Journal
Year of Publication: 2021
Authors: I-HO Lee
DOI: 10.14445/22312803/IJCTT-V69I7P109
How to Cite?
I-HO Lee, "Self-Organizing Information for Approaching AI by Relative Entropy," International Journal of Computer Trends and Technology, vol. 69, no. 7, pp. 58-69, 2021. Crossref, https://doi.org/10.14445/22312803/IJCTT-V69I7P109
In a dynamic environment, creatures must adapt their actions to interact or deal with other objects, which requires identifying the relationship between their own behaviors and outside environmental data. Moreover, before a creature can act properly, it must have experienced or learned a successful way to handle the same problem or situation. In this article, we demonstrate this identifying mechanism, which leads creatures to interact properly with the environment and learn responses from it. These interactions with environmental data are observed through the sensory organs; receiving data from the environment is the fundamental function that underlies learning the knowledge or model needed for proper action. However, there is a great deal of data, and we need to identify the relations among different pieces of data. We use information entropy to measure the occurrence counts of data. This formulation helps us find the relations between different environmental data and behavioral data. Once we can identify the connection between our behaviors and environmental data, that connection becomes a logic for judgment that guides our interactions more properly. Furthermore, we have written a program that demonstrates this dynamic environment and the interacting behaviors by combining particle swarm optimization (PSO) and information entropy. The program shows that a creature's way of learning is constructed by the identifying mechanism that forms its logic. The ideas we propose draw on basic philosophy, PSO, statistics, and behavioral psychology; these fields helped us figure out the steps of our thinking and design the program.
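As a minimal sketch of the entropy measure described above (not the authors' actual program), the following Python snippet turns occurrence counts of observed events into Shannon entropy, and computes the relative entropy (Kullback-Leibler divergence) between two empirical distributions. The event counts here are purely illustrative.

```python
import math

def entropy(counts):
    """Shannon entropy (in bits) of an empirical distribution given by occurrence counts."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

def relative_entropy(p_counts, q_counts):
    """KL divergence D(P||Q) in bits between two empirical count distributions
    over the same set of events (terms with zero P-count contribute nothing)."""
    p_total, q_total = sum(p_counts), sum(q_counts)
    d = 0.0
    for pc, qc in zip(p_counts, q_counts):
        if pc == 0:
            continue
        p, q = pc / p_total, qc / q_total
        d += p * math.log2(p / q)
    return d

# Illustrative data: how often each of four environmental events was observed.
observed = [5, 5, 5, 5]                       # uniform counts -> maximum entropy
print(entropy(observed))                      # 2.0 bits for 4 equally likely events
print(relative_entropy(observed, observed))   # 0.0 -- identical distributions
```

A PSO search can then use a relative-entropy score like this as its fitness function, moving particles toward behavior distributions that best match the observed environmental data.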
Central limit theorem, information entropy, max entropy, knowledge diffusion, PSO
 Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach, 3rd Edition, Prentice Hall, 2002.
 X.-S. Yang and S. Deb, "Cuckoo Search via Lévy Flights," World Congress on Nature & Biologically Inspired Computing, IEEE, 2009, pp. 210-214. http://papercore.org/Yang2009
 X.-S. Yang, "Firefly Algorithm, Stochastic Test Functions and Design Optimisation," 2008, pp. 1-11.
 J. Kennedy and R. Eberhart, "Particle Swarm Optimization," Proceedings of IEEE International Conference on Neural Networks, vol. 4, pp. 1942-1948, 1995.
 Ujjwal Maulik and Sanghamitra Bandyopadhyay, "Genetic Algorithm-Based Clustering Technique," Pattern Recognition, vol. 33, pp. 1455-1465, 1999.
 Ahmad Rabanimotlagh, "An Efficient Ant Colony Optimization Algorithm for Multiobjective Flow Shop Scheduling Problem," World Academy of Science, Engineering and Technology, vol. 51, pp. 127-133, 2011.
 S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, "Optimization by Simulated Annealing," Science, vol. 220, no. 4598, pp. 671-680, 1983.
 Nishith Pathak, Arindam Banerjee, and Jaideep Srivastava, "A Generalized Linear Threshold Model for Multiple Cascades," 2010 IEEE International Conference on Data Mining, DOI: 10.1109/ICDM.2010.153, ISSN: 2374-8486.