- Open Access
General iterative scheme based on the regularization for solving a constrained convex minimization problem
© Tian; licensee Springer. 2013
- Received: 2 January 2013
- Accepted: 10 October 2013
- Published: 22 November 2013
It is well known that the regularization method plays an important role in solving a constrained convex minimization problem. In this article, we introduce implicit and explicit iterative schemes based on the regularization for solving a constrained convex minimization problem. We establish results on the strong convergence of the sequences generated by the proposed schemes to a solution of the minimization problem. Such a point is also a solution of a variational inequality. We also apply the algorithm to solve a split feasibility problem.
MSC: 47H09, 47H05, 47H06, 47J25, 47J05.
- averaged mapping
- gradient-projection algorithm
- constrained convex minimization
- split feasibility problem
- variational inequality
The gradient-projection algorithm is a classical and powerful method for solving constrained convex optimization problems and has been studied by many authors (see [1–14] and the references therein). The method has recently been applied to solve split feasibility problems, which find applications in image reconstruction and intensity-modulated radiation therapy (see [15–22]).
Consider the problem of minimizing f over the constraint set C (assuming that C is a nonempty closed convex subset of a real Hilbert space H). If f is a convex and continuously Fréchet differentiable functional, the gradient-projection algorithm generates a sequence determined by the gradient of f and the metric projection onto C. Under the condition that f has a Lipschitz continuous and strongly monotone gradient, the sequence converges strongly to a minimizer of f over C. If the gradient of f is only assumed to be inverse strongly monotone, then the sequence is, in general, only weakly convergent when H is infinite-dimensional.
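As a concrete illustration (not part of the original paper), the gradient-projection iteration x_{n+1} = P_C(x_n − γ∇f(x_n)) can be sketched numerically; the quadratic objective and box constraint below are illustrative assumptions, chosen so that both P_C and ∇f have closed forms.

```python
import numpy as np

# Illustrative instance: minimize f(x) = 0.5*||x - b||^2 over the box C = [0, 1]^n.
# Then grad f(x) = x - b is 1-Lipschitz, so any step size gamma in (0, 2) is valid.
b = np.array([1.5, -0.3, 0.7])

grad_f = lambda x: x - b
proj_C = lambda x: np.clip(x, 0.0, 1.0)   # metric projection onto [0, 1]^n

x = np.zeros_like(b)                      # initial guess taken from C
gamma = 1.0                               # step size in (0, 2/L) with L = 1
for _ in range(100):
    x = proj_C(x - gamma * grad_f(x))

print(x)  # converges to the minimizer P_C(b) = [1.0, 0.0, 0.7]
```

For this strongly convex instance the iterates converge strongly; in the general inverse-strongly-monotone setting discussed above only weak convergence is guaranteed.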
Recently, Xu [23] gave an operator-oriented approach as an alternative to the gradient-projection method and to the relaxed gradient-projection algorithm, namely, an averaged mapping approach. He also presented two modifications of gradient-projection algorithms which are shown to have strong convergence.
On the other hand, regularization, in particular the traditional Tikhonov regularization, is usually used to solve ill-posed optimization problems [24, 25]. Under some conditions, we know that the regularization method is weakly convergent.
The purpose of this paper is to present the general iterative method combining the regularization method and the averaged mapping approach. We first propose implicit and explicit iterative schemes for solving a constrained convex minimization problem and prove that the methods converge strongly to a solution of the minimization problem, which is also a solution of the variational inequality. Furthermore, we use the above method to solve a split feasibility problem.
Throughout the paper, we assume that H is a real Hilbert space whose inner product and norm are denoted by ⟨·,·⟩ and ∥·∥, respectively, and that C is a nonempty closed convex subset of H. The set of fixed points of a mapping T is denoted by Fix(T), that is, Fix(T) = {x ∈ H : Tx = x}. We write x_n ⇀ x to indicate that the sequence {x_n} converges weakly to x; strong convergence of {x_n} to x is denoted by x_n → x. The following definitions and results are needed in the subsequent sections.
where L ≥ 0 is a constant. In particular, if L ∈ [0, 1), then T is called a contraction on H; if L = 1, then T is called a nonexpansive mapping on H. T is called firmly nonexpansive if 2T − I is nonexpansive, or equivalently, ⟨x − y, Tx − Ty⟩ ≥ ∥Tx − Ty∥², for all x, y ∈ H. Alternatively, T is firmly nonexpansive if and only if T can be expressed as T = (1/2)(I + S), where S : H → H is nonexpansive.
where α is a number in (0, 1) and S is nonexpansive. More precisely, when (2) holds, we say that T is α-averaged. Clearly, a firmly nonexpansive mapping (in particular, a projection) is a (1/2)-averaged map.
If T = (1 − α)W + αV for some α ∈ (0, 1), and if W is averaged and V is nonexpansive, then T is averaged.
T is firmly nonexpansive if and only if the complement I − T is firmly nonexpansive.
If T = (1 − α)W + αV for some α ∈ (0, 1), and if W is firmly nonexpansive and V is nonexpansive, then T is averaged.
The composite of finitely many averaged mappings is averaged. That is, if each of the mappings is averaged, then so is their composite. In particular, if T_1 is α_1-averaged and T_2 is α_2-averaged, where α_1, α_2 ∈ (0, 1), then the composite T_1T_2 is α-averaged, where α = α_1 + α_2 − α_1α_2.
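As an illustrative check (not from the paper), two metric projections are each (1/2)-averaged, so by the statement above their composite is α-averaged with α = 1/2 + 1/2 − 1/4 = 3/4, hence nonexpansive. The sketch below tests that nonexpansiveness numerically for two hypothetical sets, a box and a half-space.

```python
import numpy as np

rng = np.random.default_rng(0)

def proj_box(x):                 # projection onto C = [0, 1]^n (firmly nonexpansive)
    return np.clip(x, 0.0, 1.0)

def proj_halfspace(x):           # projection onto Q = {x : sum(x) <= 1}
    s = x.sum()
    return x if s <= 1.0 else x - (s - 1.0) / x.size * np.ones(x.size)

# Each projection is (1/2)-averaged, so the composite is (3/4)-averaged,
# and in particular nonexpansive: ||Tx - Ty|| <= ||x - y||.
T = lambda x: proj_box(proj_halfspace(x))

for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    assert np.linalg.norm(T(x) - T(y)) <= np.linalg.norm(x - y) + 1e-12
print("nonexpansiveness of the composite verified on 1000 random pairs")
```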
- (i) z = P_C x if and only if ⟨x − z, y − z⟩ ≤ 0 for all y ∈ C;
- (ii) z = P_C x if and only if ∥x − y∥² ≥ ∥x − z∥² + ∥y − z∥² for all y ∈ C.
Consequently, P_C is nonexpansive.
Lemma 2.3 
Lemma 2.4 (Demiclosedness principle)
Let C be a closed convex subset of a Hilbert space H, and let T : C → C be a nonexpansive mapping with Fix(T) ≠ ∅. If {x_n} is a sequence in C weakly converging to x and if {(I − T)x_n} converges strongly to y, then (I − T)x = y. In particular, if y = 0, then x ∈ Fix(T).
- (i) monotone if ⟨x − y, Tx − Ty⟩ ≥ 0 for all x, y ∈ H;
- (ii) β-strongly monotone if there exists β > 0 such that ⟨x − y, Tx − Ty⟩ ≥ β∥x − y∥² for all x, y ∈ H;
- (iii) ν-inverse strongly monotone (for short, ν-ism) if there exists ν > 0 such that ⟨x − y, Tx − Ty⟩ ≥ ν∥Tx − Ty∥² for all x, y ∈ H.
Proposition 2.2 
T is nonexpansive if and only if the complement I − T is (1/2)-ism.
If T is ν-ism, then for γ > 0, γT is (ν/γ)-ism.
T is averaged if and only if the complement I − T is ν-ism for some ν > 1/2. Indeed, for α ∈ (0, 1), T is α-averaged if and only if I − T is (1/(2α))-ism.
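These ism properties can be probed numerically. The sketch below (an illustrative check, not from the paper) verifies the (1/L)-ism inequality for the gradient of a convex quadratic f(x) = (1/2)xᵀAx, where L is the Lipschitz constant of ∇f; the matrix A is a randomly generated assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(3, 3))
A = M.T @ M                          # symmetric positive semidefinite matrix
L = np.linalg.eigvalsh(A).max()      # Lipschitz constant of grad f(x) = A x

grad = lambda x: A @ x               # gradient of the convex quadratic f(x) = 0.5 x^T A x

# Check the (1/L)-ism inequality:
#   <x - y, grad(x) - grad(y)>  >=  (1/L) * ||grad(x) - grad(y)||^2.
for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    g = grad(x) - grad(y)
    assert (x - y) @ g >= (1.0 / L) * (g @ g) - 1e-10
print("(1/L)-ism inequality verified on 1000 random pairs")
```

This is the Baillon–Haddad phenomenon specialized to quadratics: an L-Lipschitz gradient of a convex function is automatically (1/L)-inverse strongly monotone.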
Lemma 2.5 
where, in both (5) and (6), the initial guess is taken from C arbitrarily, and the parameters γ or γ_n are positive real numbers.
under some assumptions on γ or γ_n, algorithms (5) and (6) can still converge in the weak topology.
where α > 0 is the regularization parameter, and again f is convex with a (1/L)-ism gradient ∇f.
We also know that x_α converges as α → 0 to a point x*, where x* is a solution of constrained convex minimization problem (4).
Hence, P_C(I − γ∇f_α) has a unique fixed point in C, denoted by x_α, which uniquely solves fixed point equation (7).
where γ ∈ (0, 2/(L + α_n)) and α_n ∈ (0, 1) for each n. It is proved that this sequence converges strongly to a minimizer of (4).
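A regularized gradient-projection step of this kind replaces ∇f by ∇f_{α_n} = ∇f + α_n I and lets α_n → 0. The following is a minimal numerical sketch under illustrative assumptions (a quadratic f, a box constraint, and the hypothetical parameter choices α_n = 1/n, γ_n = 1/(L + α_n)); it is a sketch of the idea, not the paper's exact scheme.

```python
import numpy as np

# Illustrative instance: minimize f(x) = 0.5*||x - b||^2 over C = [0, 1]^n.
b = np.array([2.0, -1.0, 0.4])
grad_f = lambda x: x - b                  # 1-Lipschitz gradient, so L = 1
proj_C = lambda x: np.clip(x, 0.0, 1.0)

L = 1.0
x = np.zeros_like(b)
for n in range(1, 2001):
    alpha_n = 1.0 / n                     # regularization parameter, alpha_n -> 0
    gamma_n = 1.0 / (L + alpha_n)         # step size in (0, 2/(L + alpha_n))
    # gradient-projection step for f_alpha(x) = f(x) + (alpha_n/2)*||x||^2
    x = proj_C(x - gamma_n * (grad_f(x) + alpha_n * x))

print(x)  # approaches the minimizer P_C(b) = [1.0, 0.0, 0.4]
```

The vanishing regularization α_n ∥x∥²/2 makes each subproblem strongly convex while leaving the limit a minimizer of the original problem.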
3.1 Convergence of the implicit scheme
where , .
where . The same case holds for .
where . □
defines a continuous curve from into C.
- (iii)Take , and calculate(12)
Proof Set , . Let , .
So, , which is also the unique solution of the variational inequality. □
3.2 Convergence of the explicit scheme
So, {x_n} is bounded.
(3∘) Next we show that .
by Lemma 2.5 and , , , we have . □
where C and Q are nonempty closed convex subsets of Hilbert spaces H_1 and H_2, respectively, and A : H_1 → H_2 is a bounded linear operator.
It is clear that x* is a solution of split feasibility problem (18) if and only if x* ∈ C and Ax* ∈ Q.
He obtained that the sequence generated by (20) converges weakly to a solution of the SFP.
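Although (20) itself is not reproduced in this excerpt, Byrne's CQ algorithm for the SFP has the well-known form x_{n+1} = P_C(x_n − γA*(I − P_Q)Ax_n) with γ ∈ (0, 2/∥A∥²). Below is a small numerical sketch under illustrative assumptions (a box C, a unit ball Q, and an arbitrarily chosen matrix A).

```python
import numpy as np

# Illustrative SFP: find x in C = [0, 1]^2 such that A x lies in Q = {y : ||y|| <= 1}.
A = np.array([[1.0, 2.0],
              [0.0, 1.0]])

proj_C = lambda x: np.clip(x, 0.0, 1.0)

def proj_Q(y):                            # projection onto the closed unit ball
    r = np.linalg.norm(y)
    return y if r <= 1.0 else y / r

gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step size in (0, 2/||A||^2)

x = np.array([1.0, 1.0])                  # initial guess in C (infeasible for the SFP)
for _ in range(5000):
    residual = A @ x - proj_Q(A @ x)      # (I - P_Q) applied to A x
    x = proj_C(x - gamma * A.T @ residual)

print(x)  # a point of C whose image A x lies (approximately) in Q
```

The iteration is a gradient-projection method for minimizing (1/2)∥Ax − P_Q Ax∥² over C, which connects the SFP to the constrained convex minimization framework of this paper.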
Applying Theorem 3.2, we obtain the following result.
Then the sequence converges strongly to the solution of split feasibility problem (18).
where , and .
Due to Theorem 3.2, we have the conclusion immediately. □
The author read and approved the final manuscript.
The author thanks the referees for their helpful comments, which notably improved the presentation of this manuscript. This work was supported by the Fundamental Research Funds for the Central Universities (the Special Fund of Science in Civil Aviation University of China: No. 3122013K004).
- Levitin ES, Polyak BT: Constrained minimization methods. Zh. Vychisl. Mat. Mat. Fiz. 1966, 6: 787–823.
- Calamai PH, Moré JJ: Projected gradient methods for linearly constrained problems. Math. Program. 1987, 39: 93–116. 10.1007/BF02592073
- Polyak BT: Introduction to Optimization. Optimization Software, New York; 1987.
- Su M, Xu HK: Remarks on the gradient-projection algorithm. J. Nonlinear Anal. Optim. 2010, 1: 35–43.
- Moudafi A: Viscosity approximation methods for fixed-points problems. J. Math. Anal. Appl. 2000, 241: 46–55. 10.1006/jmaa.1999.6615
- Xu HK: Viscosity approximation methods for nonexpansive mappings. J. Math. Anal. Appl. 2004, 298: 279–291. 10.1016/j.jmaa.2004.04.059
- Ceng LC, Ansari QH, Yao JC: Some iterative methods for finding fixed points and for solving constrained convex minimization problems. Nonlinear Anal. TMA 2011, 74: 5286–5302. 10.1016/j.na.2011.05.005
- Marino G, Xu HK: Convergence of generalized proximal point algorithms. Commun. Pure Appl. Anal. 2004, 3: 791–808.
- Marino G, Xu HK: A general iterative method for nonexpansive mappings in Hilbert spaces. J. Math. Anal. Appl. 2006, 318: 43–52. 10.1016/j.jmaa.2005.05.028
- Tian M: A general iterative algorithm for nonexpansive mappings in Hilbert spaces. Nonlinear Anal. TMA 2010, 73: 689–694. 10.1016/j.na.2010.03.058
- Yao Y, Liou YC, Chen CP: Algorithms construction for nonexpansive mappings and inverse-strongly monotone mappings. Taiwan. J. Math. 2011, 15: 1979–1998.
- Yao Y, Chen R, Liou Y-C: A unified implicit algorithm for solving the triple-hierarchical constrained optimization problem. Math. Comput. Model. 2012, 55: 1506–1515. 10.1016/j.mcm.2011.10.041
- Yao Y, Liou Y-C, Kang SM: Two-step projection methods for a system of variational inequality problems in Banach spaces. J. Glob. Optim. 2013. 10.1007/s10898-011-9804-0
- Wiyada K, Praairat J, Poom K: Generalized systems of variational inequalities and projection methods for inverse-strongly monotone mapping. Discrete Dyn. Nat. Soc. 2011. 10.1155/2011/976505
- Censor Y, Elfving T: A multiprojection algorithm using Bregman projections in a product space. Numer. Algorithms 1994, 8: 221–239. 10.1007/BF02142692
- Byrne C: A unified treatment of some iterative algorithms in signal processing and image reconstruction. Inverse Probl. 2004, 20: 103–120. 10.1088/0266-5611/20/1/006
- Censor Y, Elfving T, Kopf N, Bortfeld T: The multiple-sets split feasibility problem and its applications for inverse problems. Inverse Probl. 2005, 21: 2071–2084. 10.1088/0266-5611/21/6/017
- Censor Y, Bortfeld T, Martin B, Trofimov A: A unified approach for inversion problems in intensity-modulated radiation therapy. Phys. Med. Biol. 2006, 51: 2353–2365. 10.1088/0031-9155/51/10/001
- Xu HK: A variable Krasnosel'skii-Mann algorithm and the multiple-set split feasibility problem. Inverse Probl. 2006, 22: 2021–2034. 10.1088/0266-5611/22/6/007
- Xu HK: Iterative methods for the split feasibility problem in infinite-dimensional Hilbert spaces. Inverse Probl. 2010, 26: Article ID 105018
- Lopez G, Martin V, Xu HK: Perturbation techniques for nonexpansive mappings with applications. Nonlinear Anal., Real World Appl. 2009, 10: 2369–2383. 10.1016/j.nonrwa.2008.04.020
- Lopez G, Martin V, Xu HK: Iterative algorithms for the multiple-sets split feasibility problem. In Biomedical Mathematics: Promising Directions in Imaging, Therapy Planning and Inverse Problems. Edited by: Censor Y, Jiang M, Wang G. Medical Physics Publishing, Madison; 2009: 243–279.
- Xu HK: Averaged mappings and the gradient-projection algorithm. J. Optim. Theory Appl. 2011, 150: 360–378. 10.1007/s10957-011-9837-z
- Xu HK: A regularization method for the proximal point algorithm. J. Glob. Optim. 2006, 36: 115–125. 10.1007/s10898-006-9002-7
- Cho YJ, Petrot N: Regularization and iterative method for general variational inequality problem in Hilbert spaces. J. Inequal. Appl. 2011, 2011: Article ID 21
- Combettes PL: Solving monotone inclusions via compositions of nonexpansive averaged operators. Optimization 2004, 53(5–6): 475–504. 10.1080/02331930412331327157
- Goebel K, Kirk WA: Topics in Metric Fixed Point Theory. Cambridge Studies in Advanced Mathematics 28. Cambridge University Press, Cambridge; 1990.
This article is published under license to BioMed Central Ltd. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.