5. Conclusion
Deterministic optimization methods, such as the BFGS method, are known for their fast convergence on convex optimization problems. However, they tend to become trapped in local minima on non-convex problems. In this paper, we proposed two hybrid algorithms for constrained global optimization. Based on the exact penalty function method, the constrained optimization problems were first transformed into box-constrained optimization problems. Then, a novel reposition technique, Greedy Diffusion Search (GDS), was proposed and integrated with the limited-memory BFGS (L-BFGS) method under two different strategies, where GDS enables the algorithm to escape from local minima. We showed that our algorithms converge to a global minimizer with probability one. Eighteen box-constrained optimization problems and four general constrained optimization problems were solved by the proposed methods, and the results were compared with those obtained by existing methods. The comparisons show that our methods achieve higher accuracy with fewer function evaluations. However, since L-BFGS is used in both hybrid methods, the functions involved in Problem (P) are required to be smooth. It remains an important task to develop effective hybrid algorithms for non-smooth optimization problems.
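To illustrate the overall structure of such a hybrid scheme, the following is a minimal sketch combining an L-BFGS-B local search with a random reposition step. Note that the reposition rule here is a generic random perturbation standing in for GDS: the actual GDS update is not specified in this section, and the function `hybrid_minimize`, its parameters, and the perturbation scale are all illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def hybrid_minimize(f, bounds, n_restarts=50, seed=0):
    """Hypothetical hybrid global search: L-BFGS-B local descent
    plus a random reposition step (a stand-in for GDS)."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    best_x, best_f = None, np.inf
    x0 = rng.uniform(lo, hi)  # random starting point in the box
    for _ in range(n_restarts):
        # local phase: L-BFGS-B descends to a nearby local minimizer
        res = minimize(f, x0, method="L-BFGS-B", bounds=bounds)
        if res.fun < best_f:
            best_x, best_f = res.x, res.fun
        # reposition phase: perturb the incumbent to escape its basin
        # (GDS would use a greedy diffusion rule here instead)
        step = rng.normal(scale=0.5 * (hi - lo))
        x0 = np.clip(best_x + step, lo, hi)
    return best_x, best_f

# Example on the 2-D Rastrigin function, a standard multimodal
# test problem with global minimum 0 at the origin.
def rastrigin(x):
    x = np.asarray(x)
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

x_star, f_star = hybrid_minimize(rastrigin, [(-5.12, 5.12)] * 2)
```

A plain L-BFGS-B run from a single random start would typically stall in one of Rastrigin's many local minima; the reposition loop is what lets the search keep improving the incumbent, which is the role GDS plays in the proposed methods.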