Dr. Manu Pratap Singh
Dr. B. R. Ambedkar University, India
Title: Implementation of a Suboptimal Genetic Algorithm for Pattern Recalling from Hopfield Neural Networks
Abstract: 

Hopfield proposed a fully connected neural network model of associative memory in which information is stored by distributing it among the neurons and recalled from the dynamically relaxed neuron states. If these states are mapped to certain desired memory vectors, the time evolution of the dynamics leads to a stable state. These stable states of the network represent the stored patterns. Hopfield used the Hebbian learning rule to prescribe the weight matrix that establishes these stable states. A major drawback of this type of neural network is that the memory attractors are accompanied by a huge number of spurious memory attractors, so the network dynamics is very likely to be trapped in these spurious attractors, which prevents retrieval of the memory attractors. Hopfield-type networks are also likely to be trapped in non-optimal local minima close to the starting point, which is undesirable. The presence of false minima increases the probability of error in recalling the stored patterns. The problem of false minima can be reduced by adopting an evolutionary algorithm to carry out the search for global minima. Many researchers have applied evolutionary techniques (simulated annealing and genetic algorithms) to minimize the problem of false minima; Imada and Araki, for example, have applied evolutionary computation to Hopfield neural networks in various ways, and a rigorous treatment of the capacity of the Hopfield associative memory can be found in the literature. The genetic algorithm has been identified as one of the prominent search techniques for exploring the global minima in Hopfield neural networks.
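
To make the storage and recall dynamics concrete, the following is a minimal, self-contained sketch in Python (not the authors' implementation) of a Hopfield network with one-shot Hebbian storage and asynchronous relaxation to a stable state; the pattern count, network size, and noise level are illustrative assumptions.

import numpy as np

def hebbian_weights(patterns):
    # Prescribe the weight matrix from bipolar patterns (one-shot Hebbian rule):
    # the scaled sum of outer products, with self-connections removed.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_sweeps=100, rng=None):
    # Asynchronously relax the network until it reaches a fixed point,
    # i.e. an attractor of the dynamics (stored or spurious).
    rng = rng if rng is not None else np.random.default_rng(0)
    state = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(state)):
            new = 1 if W[i] @ state >= 0 else -1
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:
            break
    return state

# Store two random bipolar patterns and recall from a noisy probe.
rng = np.random.default_rng(42)
patterns = rng.choice([-1, 1], size=(2, 64))
W = hebbian_weights(patterns)
probe = patterns[0].copy()
probe[rng.choice(64, size=8, replace=False)] *= -1   # corrupt 8 of 64 bits
print(np.array_equal(recall(W, probe, rng=rng), patterns[0]))  # usually True here

With only two stored patterns the probe is usually restored exactly; as more patterns are stored, the spurious attractors described above begin to capture the dynamics and recall fails more often.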


Developed by Holland, the genetic algorithm is a biologically inspired search technique. In simple terms, the technique involves generating a random initial population of individuals, each of which represents a potential solution to a problem. Each member of this population is evaluated by a fitness function against some known criteria. The members selected on the basis of their fitness are then used to generate a new population of potential solutions through the operators of the genetic algorithm. This process of evaluation, selection, and recombination is iterated until the population converges to an acceptable optimal solution. Genetic algorithms (GAs) require only fitness information, not gradient information or other internal knowledge of the problem, as is the case with neural networks. Genetic algorithms have traditionally been used in optimization but, with a few enhancements, can perform classification, prediction, and pattern association as well. The GA has been used very effectively for function optimization, and it can search efficiently for approximate global minima. It has been observed that pattern recalling in Hopfield-type neural networks can be performed efficiently with a GA. The GA in this case is expected to yield alternative globally optimal values of the weight matrix corresponding to all the stored patterns. The conventional Hopfield neural network suffers from the problems of non-convergence and local minima as the complexity of the network increases; the GA, however, is particularly well suited to searching large and complex spaces efficiently, finding the global optima, and converging. Considerable research into the Hopfield network has shown that the model may be trapped by four well-identified classes of spurious attractors: mixture states, spin glass states, complement states, and alien attractors. As the complexity of the search space increases, the GA presents an increasingly attractive alternative for pattern storage and recalling in Hopfield-type neural networks of associative memory.
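
The loop just described fits in a few lines. Below is a minimal Python sketch, with the standard OneMax fitness (count of 1-bits) standing in for a real problem; the population size, mutation rate, and generation count are arbitrary illustrative choices.

import random

def evolve(n_bits=32, pop_size=40, generations=60, p_mut=0.02, seed=1):
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)                 # OneMax: number of 1-bits
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():                          # keep the fitter of two
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)         # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]
            nxt.append(child)
        pop = nxt                 # evaluation, selection, recombination, iterated
    return max(pop, key=fitness)

best = evolve()
print(sum(best), "of 32 bits set")   # the population converges toward the optimum

Note that the loop touches the problem only through the fitness function, which is exactly the property noted above: a GA can search a space without gradient information or other internal knowledge of the problem.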


Much work has been done on the evolution of neural networks with GAs. A great deal of research has applied evolutionary techniques to layered neural networks; however, their applications to fully connected neural networks remain few so far. The first attempts to combine evolutionary algorithms with Hopfield neural networks dealt with the training of connection weights, the design of the network architecture, or both. Evolution has been introduced into neural networks at three levels: architectures, connection weights, and learning rules. The evolution of connection weights proceeds at the lowest level, on the fastest time scale, in an environment determined by the architecture, a learning rule, and the learning tasks. The evolution of connection weights introduces an adaptive and global approach to training, especially in the reinforcement learning and recurrent network learning paradigms. Training of neural networks using evolutionary algorithms started in the early 1990s; reviews can be found in the literature. Cardenas et al. presented the architecture optimization of neural networks using parallel genetic algorithms for pattern recognition based on human faces, comparing the results of the training stage for sequential and parallel implementations. Genetic evolution has also been used for data-structure processing in image classification.


The work on which we are mainly focusing is studying the performance of the Hopfield neural network for recalling stored patterns with the Hebbian rule and a genetic algorithm, for static images of English alphabets. Many learning rules have been suggested to improve the performance of the Hopfield neural network. The conventional Hopfield model uses the bipolar product rule, or Hebbian learning rule, to encode the pattern information in the form of a weight matrix; this one-shot Hebbian learning provides relatively poor capacity and recalling performance. Here, we have employed a genetic algorithm on the Hopfield neural network to obtain an optimal weight matrix for which the recalling of memorized patterns corresponding to presented noisy prototype input patterns improves. The objective of this study is to determine the optimal weight matrix for correct recalling of static images. For this purpose we have considered scanned static images of English alphabets as input stimuli; these stimuli are preprocessed, filtered using an edge-dilation method, and represented as pattern information using a Self-Organizing Map (SOM). The code words generated by the SOM are fed into the Hopfield neural network as pattern information using the Hebbian rule. The simulation results demonstrate the better performance of the Hopfield neural network for recalling the stored static images of English alphabets using the genetic algorithm.
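
As a rough illustration of this approach (a sketch under stated assumptions, not the authors' implementation), the following Python fragment seeds a GA near the one-shot Hebbian weight matrix, evolves symmetric zero-diagonal perturbations of it, and scores each candidate matrix by how well noisy probes recall the stored patterns. The SOM preprocessing stage is omitted: random bipolar vectors stand in for the SOM-generated code words, and all sizes and GA parameters are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(7)
N, P = 64, 4
patterns = rng.choice([-1, 1], size=(P, N))   # stand-ins for SOM code words

def hebbian(pats):
    # One-shot Hebbian (bipolar product) rule: the suboptimal starting point.
    W = pats.T @ pats / pats.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, s, sweeps=20):
    # Asynchronous relaxation until a fixed point (or the sweep limit).
    s = s.copy()
    for _ in range(sweeps):
        prev = s.copy()
        for i in range(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
        if np.array_equal(s, prev):
            break
    return s

def noisy(p, k=10):
    # A noisy prototype: the pattern with k bits flipped.
    q = p.copy()
    q[rng.choice(N, size=k, replace=False)] *= -1
    return q

def fitness(W):
    # Fraction of bits recalled correctly from one noisy probe per pattern.
    return float(np.mean([np.mean(recall(W, noisy(p)) == p) for p in patterns]))

def mutate(W, sigma=0.05):
    # A symmetric, zero-diagonal perturbation keeps W a valid Hopfield matrix.
    d = rng.normal(0.0, sigma, size=(N, N))
    d = (d + d.T) / 2
    np.fill_diagonal(d, 0.0)
    return W + d

pop = [mutate(hebbian(patterns)) for _ in range(12)]  # start near the Hebbian W
for gen in range(30):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:4]                                   # truncation selection
    pop = elite + [mutate(w) for w in elite for _ in range(2)]
print("best recall fitness:", round(fitness(pop[0]), 3))

Seeding the search at the Hebbian matrix rather than at random weights reflects the suboptimal starting point named in the title: it reduces the randomness of the GA while still allowing the search to escape the false minima of the one-shot rule.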


Keywords: Pattern storage, Genetic algorithm (GA), Hopfield neural network, Pattern recognition, Pattern association.

Biography: 
Dr. Manu Pratap Singh received his Ph.D. in Computer Science from Kumaun University, Nainital, Uttarakhand, India, in 2001. He completed his Master of Science in Computer Science at Allahabad University, Allahabad, in 1995, and subsequently obtained an M.Tech. in Information Technology from Mysore. He has been an Associate Professor in the Department of Computer Science, Institute of Engineering and Technology, Dr. B. R. Ambedkar University, Agra, UP, India, since 2008. He has been engaged in teaching and research for the last 16 years. He has more than 80 research papers in journals of international and national repute, and his work has been recognized widely around the world in the form of citations of his research papers. He received the Young Scientist Award in Computer Science from the International Academy of Physical Sciences, Allahabad, in 2005. He has guided 18 students to their doctorates in computer science. He is also a referee for various international and national journals, including the International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems (World Scientific Publishing), the International Journal of Engineering (Iran), IEEE Transactions on Fuzzy Systems, and the European Journal of Operational Research (Elsevier). He has developed a feedforward neural network simulator for handwritten character recognition of English alphabets. He has also developed a hybrid evolutionary algorithm for handwritten character recognition of English as well as for Hindi language classification. In this hybrid approach the genetic algorithm is incorporated with the back-propagation learning rule to train the feedforward neural networks; the genetic algorithm starts from a suboptimal solution and converges to optimal solutions, and since more than one optimal solution has been obtained, the approach leads to a multi-objective optimization phenomenon. Another hybrid evolutionary algorithm has been developed for feedback neural networks of the Hopfield type for efficient recalling of memorized patterns. Here, too, the randomness of the genetic algorithm is reduced by starting it from a suboptimal solution, in the form of a parent weight matrix, and letting it converge to globally optimal solutions, i.e., correct weight matrices that the network can use for efficient pattern recalling. His research interests are focused on neural networks, pattern recognition and machine intelligence, soft computing, etc. He has been a member of the technical committee of IASTED, Canada, since 2004. He has also been a regular member of the Machine Intelligence Research Labs (MIR Labs), Scientific Network for Innovation and Research Excellence (SNIRE), Auburn, Washington, USA, http://www.mirlabs.org, since 2012. His Google Scholar h-index is 9, his i10-index is 8, and he has 257 citations. He has been invited as a keynote speaker and guest speaker at various institutions in India and abroad.