Hybrid Genetic Algorithm and Newton’s Method for Large-Scale Optimization Problems
Abstract
Large-scale optimization problems are pervasive in science, engineering, and industry, often characterized by complex, multimodal landscapes and high dimensionality. Traditional optimization methods, such as genetic algorithms (GAs) and Newton’s method, each have distinct strengths and weaknesses: GAs are robust global searchers but can be slow to converge, while Newton’s method is a powerful local optimizer but sensitive to initial guesses and prone to getting trapped in local minima. Hybridizing these methods leverages their complementary strengths, enabling efficient and reliable optimization for large-scale problems. This article presents the principles, implementation, and performance of hybrid genetic algorithm–Newton methods, with a focus on the hybrid genetic deflated Newton (HGDN) approach. Numerical experiments demonstrate that these hybrids outperform standalone algorithms in terms of convergence speed, accuracy, and the ability to find multiple optima.
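The hybridization the abstract describes, a GA performing global exploration followed by Newton refinement of the best candidate, can be sketched as below. This is a minimal illustration, not the HGDN algorithm from the article: the deflation step that lets HGDN locate multiple optima is omitted, and all function names, population parameters, and the finite-difference derivatives are illustrative assumptions.

```python
import numpy as np

def rastrigin(x):
    """Multimodal benchmark; global minimum f(0, ..., 0) = 0."""
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def grad(f, x, h=1e-6):
    """Central-difference gradient (analytic derivatives would be used in practice)."""
    g = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        e = np.zeros_like(x, dtype=float); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def hessian(f, x, h=1e-4):
    """Central-difference Hessian approximation."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * h * h)
    return H

def ga_search(f, dim, pop_size=40, gens=60, bounds=(-5.12, 5.12), seed=0):
    """Simple real-coded GA: elitist selection, uniform crossover, Gaussian mutation."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (pop_size, dim))
    for _ in range(gens):
        fit = np.array([f(ind) for ind in pop])
        pop = pop[np.argsort(fit)]          # sort by fitness, best first
        elite = pop[: pop_size // 2]        # keep the better half
        n_child = pop_size - len(elite)
        parents = elite[rng.integers(0, len(elite), (n_child, 2))]
        mask = rng.random((n_child, dim)) < 0.5
        children = np.where(mask, parents[:, 0], parents[:, 1])  # uniform crossover
        children += rng.normal(0.0, 0.3, children.shape)         # Gaussian mutation
        pop = np.vstack([elite, np.clip(children, lo, hi)])
    fit = np.array([f(ind) for ind in pop])
    return pop[np.argmin(fit)]

def newton_refine(f, x0, tol=1e-8, max_iter=50):
    """Newton's method started from the GA's best candidate."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        g = grad(f, x)
        if np.linalg.norm(g) < tol:
            break
        H = hessian(f, x)
        try:
            p = np.linalg.solve(H, -g)      # Newton step: solve H p = -g
        except np.linalg.LinAlgError:
            p = -g                          # fall back to steepest descent
        x = x + p
    return x

if __name__ == "__main__":
    best = ga_search(rastrigin, dim=2)      # global phase
    xstar = newton_refine(rastrigin, best)  # local phase
    print("refined point:", xstar, "f =", rastrigin(xstar))
```

The division of labor mirrors the abstract's argument: the GA supplies a starting point inside a good basin of attraction, which removes Newton's sensitivity to the initial guess, while Newton supplies the fast local convergence the GA lacks.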
How to Cite This Article
Rajesh Kumar Gupta, Emily Clarkson (2025). Hybrid Genetic Algorithm and Newton’s Method for Large-Scale Optimization Problems. International Journal of Applied Mathematics and Numerical Research (IJAMNR), 1(4), 04-06.