
International Journal of Applied Mathematics and Numerical Research

ISSN: (Print) | 3107-7110 (Online) | Impact Factor: 8.62 | Open Access

Hybrid Genetic Algorithm and Newton’s Method for Large-Scale Optimization Problems


Abstract

Large-scale optimization problems are pervasive in science, engineering, and industry, and are often characterized by high dimensionality and complex, multimodal landscapes. Traditional optimization methods, such as genetic algorithms (GAs) and Newton's method, have distinct strengths and weaknesses: GAs are robust global search methods but can be slow to converge, while Newton's method is a powerful local optimizer that is sensitive to its starting point and converges only to a nearby stationary point, which may be a local minimum. Hybridizing these methods leverages their complementary strengths, enabling efficient and reliable optimization for large-scale problems. This article presents the principles, implementation, and performance of hybrid genetic algorithm–Newton methods, with a focus on the hybrid genetic deflated Newton (HGDN) approach. Numerical experiments demonstrate that these hybrids outperform the standalone algorithms in convergence speed, accuracy, and the ability to find multiple optima.
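The basic two-phase strategy described above, a GA for global exploration followed by Newton refinement of the best candidate, can be sketched as below. This is an illustrative sketch only, not the paper's HGDN algorithm (in particular, it omits deflation); the Rastrigin test function, the GA parameters, and the guarded Newton step are all assumptions chosen for a self-contained example.

```python
import math
import random

def rastrigin(x):
    # Standard multimodal benchmark: global minimum f(0, ..., 0) = 0,
    # surrounded by many local minima near the integer lattice.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def grad(x):
    # Analytic gradient of the Rastrigin function.
    return [2 * xi + 20 * math.pi * math.sin(2 * math.pi * xi) for xi in x]

def hess_diag(x):
    # The Rastrigin Hessian is diagonal, so Newton steps decouple per coordinate.
    return [2 + 40 * math.pi ** 2 * math.cos(2 * math.pi * xi) for xi in x]

def ga_search(f, dim=2, pop_size=40, gens=60, bounds=(-5.12, 5.12), seed=0):
    # Elitist GA: keep the better half, fill up with crossover + mutation.
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        elite = pop[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            # Uniform crossover: each coordinate comes from one parent.
            child = [ai if rng.random() < 0.5 else bi for ai, bi in zip(a, b)]
            if rng.random() < 0.3:  # Gaussian mutation, clipped to the bounds.
                j = rng.randrange(dim)
                child[j] = min(hi, max(lo, child[j] + rng.gauss(0.0, 0.3)))
            children.append(child)
        pop = elite + children
    return min(pop, key=f)

def newton_refine(f, x, iters=20):
    # Guarded Newton: accept a step only when it decreases f, because the
    # Hessian is indefinite far from a minimum and a raw step could diverge.
    for _ in range(iters):
        g, h = grad(x), hess_diag(x)
        trial = [xi - gi / hi if abs(hi) > 1e-12 else xi
                 for xi, gi, hi in zip(x, g, h)]
        if f(trial) < f(x):
            x = trial
        else:
            break
    return x

best = ga_search(rastrigin)          # phase 1: global GA search
refined = newton_refine(rastrigin, best)  # phase 2: local Newton polish
```

Because the Newton phase is guarded, the refined point is never worse than the GA's best candidate; in the hybrids discussed in the article, this local polish is what supplies the fast final convergence that the GA alone lacks.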

How to Cite This Article

Rajesh Kumar Gupta, Emily Clarkson (2025). Hybrid Genetic Algorithm and Newton's Method for Large-Scale Optimization Problems. International Journal of Applied Mathematics and Numerical Research (IJAMNR), 1(4), 04-06.
