JACIII Vol.11 No.5 pp. 469-477
doi: 10.20965/jaciii.2007.p0469


Optimization Method RasID-GA for Numerical Constrained Optimization Problems

Dongkyu Sohn, Shingo Mabu, Kotaro Hirasawa, and Jinglu Hu

Graduate School of Information, Production and Systems, Waseda University, 2-7 Hibikino, Wakamatsu-ku, Kitakyushu-shi, Fukuoka 808-0135, Japan

Received: November 13, 2006
Accepted: April 23, 2007
Published: June 20, 2007

Keywords: optimization, RasID, GA, switching
This paper proposes Adaptive Random search with Intensification and Diversification combined with Genetic Algorithm (RasID-GA) for constrained optimization. In previous work, we proposed RasID-GA, which combines the best properties of RasID and the Genetic Algorithm, for unconstrained optimization problems. Constrained optimization problems are generally difficult to solve because the feasible solution space is very limited and both the objective function and the constraint conditions must be taken into account. Conventional constrained optimization methods usually handle constraints with penalty functions, but it is widely recognized that balancing the penalty terms against the objective function is difficult. In this paper, we propose a constrained optimization method using RasID-GA that solves the given problems without penalty functions. The proposed method is tested on 11 well-known constrained benchmark problems and compared with Evolution Strategy with Stochastic Ranking. The simulation results show that RasID-GA can find optimal or near-optimal solutions without using penalty functions.
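The balancing problem mentioned in the abstract can be seen in a toy example (not from the paper): minimizing f(x) = x subject to g(x) = 1 - x ≤ 0 with a quadratic penalty F(x) = f(x) + r·max(0, g(x))², where the penalty coefficient r is a hypothetical tuning parameter. A weak penalty lets the minimizer drift into the infeasible region, while only a sufficiently strong penalty pushes it toward the constraint boundary x = 1:

```python
# Toy illustration of penalty-function sensitivity (assumed example, not the
# paper's method): minimize f(x) = x subject to g(x) = 1 - x <= 0 by grid
# search over the penalized objective F(x) = x + r * max(0, 1 - x)**2.

def penalized_argmin(r, lo=-2.0, hi=3.0, step=0.01):
    """Return the grid point minimizing the penalized objective for weight r."""
    best_x, best_f = lo, float("inf")
    x = lo
    while x <= hi:
        fx = x + r * max(0.0, 1.0 - x) ** 2  # objective plus quadratic penalty
        if fx < best_f:
            best_x, best_f = x, fx
        x += step
    return best_x

# With r = 0.5 the minimizer sits near x = 0 (infeasible, since g(0) = 1 > 0);
# with r = 50 it moves close to the feasible boundary x = 1.
print(penalized_argmin(0.5), penalized_argmin(50))
```

Analytically, for x < 1 the penalized minimizer is x* = 1 - 1/(2r), so no finite r makes the unconstrained minimum exactly feasible; this dependence on r is the balancing difficulty that penalty-free methods such as RasID-GA and stochastic ranking aim to avoid.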
Cite this article as:
D. Sohn, S. Mabu, K. Hirasawa, and J. Hu, “Optimization Method RasID-GA for Numerical Constrained Optimization Problems,” J. Adv. Comput. Intell. Intell. Inform., Vol.11 No.5, pp. 469-477, 2007.
