Feature Extraction from web data using Artificial Neural Networks

 
International Journal of Computer Trends and Technology (IJCTT)

© 2012 by IJCTT Journal
Volume-3 Issue-5
Year of Publication: 2012
Authors: Manoj Kumar Sharma, Vishal Shrivastav

MLA

Manoj Kumar Sharma, Vishal Shrivastav. "Feature Extraction from web data using Artificial Neural Networks." International Journal of Computer Trends and Technology (IJCTT) V3(5):561-564, 2012. ISSN 2231-2803. www.ijcttjournal.org. Published by Seventh Sense Research Group.

Abstract: The main ability of a neural network is to learn from its environment and to improve its performance through learning. There are two types of learning. In supervised (active) learning, an external 'teacher' or supervisor presents a training set to the network. In unsupervised learning [1], by contrast, the network organizes itself without an external teacher: during the training session it receives a number of input patterns, discovers significant features in these patterns, and learns how to classify the input data into appropriate categories. This self-organized learning follows the neuro-biological organization of the brain. Such algorithms aim to learn rapidly; they learn much faster than back-propagation networks and can therefore be used in real time, and unsupervised neural networks are effective in dealing with unexpected and changing conditions [3]. There are two major forms of self-organizing learning: Hebbian learning and competitive learning. In this paper we use Hebbian learning to show how it can help in extracting features from any data. We work with example input vectors and a weight matrix, denoting the presence of a feature by 1 and its absence by 0. With this method we show how features are identified and how patterns in the given data can be discovered; the method can also be used for classification and clustering. We then apply this learning rule to web data (content) to discover patterns and extract features. A weight increases when the same pattern repeats and decreases when it does not. The network associates an input xi with outputs yi and yj because the inputs xi and xj were coupled during training; however, it cannot associate an input x with an output y if that input never appeared during training, since the network has no ability to recognize it.
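As a rough illustration of the weight-update behaviour described in the abstract (this is not code from the paper; the network size, the learning rate lr, the forgetting factor decay, and the toy patterns matrix are all assumptions), the following Python sketch trains a single-layer network with a Hebbian rule plus a forgetting term on binary feature vectors, where 1 marks the presence of a feature and 0 its absence:

```python
import numpy as np

def step(v, threshold=0.5):
    """Hard-limit activation: 1 if the net input exceeds the threshold."""
    return (v > threshold).astype(float)

def hebbian_train(patterns, lr=0.2, decay=0.1, epochs=50):
    """Single-layer network trained with a Hebbian rule plus forgetting.

    Weights between co-active input/output pairs grow (the pattern
    repeats); a forgetting term keeps them bounded and lets unused
    connections fade. Parameter values are illustrative assumptions.
    """
    n = patterns.shape[1]
    W = np.eye(n)                 # start with each output mirroring its input
    for _ in range(epochs):
        for x in patterns:
            y = step(W @ x)       # outputs active for the presented pattern
            # Hebbian strengthening of co-active pairs, forgetting elsewhere
            W += lr * np.outer(y, x) - decay * (W * y[:, None])
    return W

# Toy binary feature vectors: 1 marks the presence of a feature, 0 its absence.
patterns = np.array([[1, 1, 0, 0],
                     [0, 0, 1, 1]], dtype=float)
W = hebbian_train(patterns)

# Features 0 and 1 were coupled during training, so presenting feature 0
# alone now activates output 1 as well (the learned association).
print(step(W @ np.array([1.0, 0, 0, 0])))   # -> [1. 1. 0. 0.]
```

Because features 0 and 1 (and likewise 2 and 3) are coupled in the training patterns, the cross-connection weights between them grow toward lr/decay, so presenting feature 0 alone activates output 1 as well; an input that never appeared during training develops no such association, mirroring the behaviour described above.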

References

[1] Becker, S. & Plumbley, M. (1996). Unsupervised neural network learning procedures for feature extraction and classification. International Journal of Applied Intelligence, 6, 185-203.
[2] Mumford, D. (1994). Neuronal architectures for pattern-theoretic problems. In C. Koch and J. Davis, editors, Large-Scale Theories of the Cortex. Cambridge, MA: MIT Press, 125-152.
[3] Yegnanarayana, B. (1994). Artificial neural networks for pattern recognition. Sadhana, Vol. 19, Part 2, April 1994, pp. 189-238.
[4] Negnevitsky, M. Artificial Intelligence: A Guide to Intelligent Systems, 2/E. Addison-Wesley. ISBN-10: 0321204662, ISBN-13: 9780321204660.
[5] Guyon, I. & Elisseeff, A. An Introduction to Feature Extraction.
[6] Haykin, S. (2010). Neural Networks and Learning Machines. PHI.
[7] Ni, Xianjun (2008). Research of Data Mining Based on Neural Networks. World Academy of Science, Engineering and Technology, 39, pp. 381-38.

Keywords—Artificial neural networks, unsupervised learning, Hebbian learning, feature extraction, web data, classification, clustering.