 Research
 Open Access
Hybrid CQ projection algorithm with linesearch process for the split feasibility problem
Journal of Inequalities and Applications volume 2016, Article number: 106 (2016)
Abstract
In this paper, we propose a hybrid CQ projection algorithm with two projection steps and one Armijo-type linesearch step for the split feasibility problem. The linesearch technique is used to construct a hyperplane that strictly separates the current point from the solution set. The next iterate is obtained by projecting the initial point onto a shrinking region (the intersection of three sets); hence, the algorithm converges faster than some other algorithms. Under some mild conditions, we prove convergence. Preliminary numerical experiments show that our algorithm is efficient.
Introduction
The split feasibility problem (SFP) is to find a point x satisfying
where C and Q are nonempty closed convex sets in \(\Re^{N}\) and \(\Re^{M}\), respectively, and A is an M by N real matrix. The SFP was originally introduced in [1] and has broad applications in many fields, such as image reconstruction, signal processing, and radiation therapy [2–4]. Various algorithms have been invented to solve it (see [5–16] and the references therein). The well-known CQ algorithm presented in [1] is defined as follows: denote by \(P_{C}\) the orthogonal projection onto C, that is, \(P_{C}(x)=\arg\min_{y\in C}\Vert x-y\Vert \) for \(x\in\Re^{N}\); then take an arbitrary initial point \(x^{0}\) and define the iterative step by
where \(0<\gamma<2/\rho(A^{T}A)\), and \(\rho(A^{T}A)\) is the spectral radius of \(A^{T}A\).
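To make the scheme concrete, here is a minimal numerical sketch of the CQ iteration in Python. The toy instance (a diagonal matrix with C and Q taken as unit balls) is an illustrative choice, not from the paper:

```python
import numpy as np

def project_ball(x, center, radius):
    # Euclidean projection onto the ball {y : ||y - center|| <= radius}
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def cq_algorithm(A, proj_C, proj_Q, x0, max_iter=1000, tol=1e-8):
    # Classical CQ iteration: x^{k+1} = P_C(x^k - gamma * A^T (I - P_Q) A x^k)
    # with a fixed step gamma in (0, 2/rho(A^T A)).
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # = 1/rho(A^T A), safely inside the range
    x = x0.astype(float)
    for _ in range(max_iter):
        Ax = A @ x
        grad = A.T @ (Ax - proj_Q(Ax))        # = F(x), the gradient-type operator
        x_new = proj_C(x - gamma * grad)
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x

# Toy instance: find x in the unit ball C with Ax in the unit ball Q.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
pC = lambda x: project_ball(x, np.zeros(2), 1.0)
pQ = lambda y: project_ball(y, np.zeros(2), 1.0)
x = cq_algorithm(A, pC, pQ, np.array([3.0, 3.0]))
```

Here \(\gamma=1/\rho(A^{T}A)\) is one admissible fixed choice; any value in \((0, 2/\rho(A^{T}A))\) works, which is exactly the restriction the next paragraph discusses.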
The algorithms mentioned above use a fixed stepsize restricted by the Lipschitz constant L, which depends on the largest eigenvalue (spectral radius) of the matrix \(A^{T}A\). Computing the largest eigenvalue may be very hard, and a conservative estimate of the constant L usually results in slow convergence. To overcome the difficulty of estimating the Lipschitz constant, He et al. [17] developed a self-adaptive method for solving variational inequality problems. The numerical results reported in [17] show that the self-adaptive strategy is valid and robust. Subsequently, many self-adaptive projection methods were presented to solve the split feasibility problem [18, 19]. On the other hand, the hybrid projection method was developed by Nakajo and Takahashi [20], Kamimura and Takahashi [21], and Martinez-Yanes and Xu [22] to solve the problem of finding a common element of the set of fixed points of a nonexpansive mapping and the set of solutions of an equilibrium problem. Many modified hybrid projection methods were later presented to solve different problems [23, 24].
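The self-adaptive idea can be illustrated with a generic Armijo-type backtracking rule of the kind used in the self-adaptive projection literature [17–19]. This is a sketch under assumptions: the acceptance condition \(\mu\Vert F(x)-F(z)\Vert \leq\sigma\Vert x-z\Vert\) is one common choice, not the specific rule of any of the cited papers, and the two-ball instance is invented for illustration:

```python
import numpy as np

def project_ball(x, center, radius):
    # Euclidean projection onto the ball {y : ||y - center|| <= radius}
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def cq_selfadaptive(A, proj_C, proj_Q, x0, beta=1.0, gamma=0.5, sigma=0.3,
                    max_iter=500, tol=1e-8):
    # Self-adaptive CQ sketch: shrink the trial stepsize mu geometrically until
    # mu * ||F(x) - F(z)|| <= sigma * ||x - z||, so no eigenvalue of A^T A
    # has to be estimated in advance.
    def F(x):
        Ax = A @ x
        return A.T @ (Ax - proj_Q(Ax))
    x = x0.astype(float)
    for _ in range(max_iter):
        mu, Fx = beta, F(x)
        z = proj_C(x - mu * Fx)
        while mu * np.linalg.norm(Fx - F(z)) > sigma * np.linalg.norm(x - z) and mu > 1e-12:
            mu *= gamma                      # backtrack: Lipschitz continuity of F
            z = proj_C(x - mu * Fx)          # guarantees finite termination
        if np.linalg.norm(x - z) <= tol:
            return z
        x = z
    return x

A = np.array([[2.0, 0.0], [0.0, 1.0]])
pC = lambda x: project_ball(x, np.zeros(2), 1.0)
pQ = lambda y: project_ball(y, np.zeros(2), 1.0)
x = cq_selfadaptive(A, pC, pQ, np.array([3.0, 3.0]))
```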
In this paper, motivated by the self-adaptive and hybrid projection methods for solving variational inequality problems, and based on the CQ algorithm for the SFP, we propose a hybrid CQ projection algorithm for the split feasibility problem that uses different variable stepsizes in its two projection steps. The algorithm performs a computationally inexpensive Armijo-type linesearch along the search direction in order to generate a separating hyperplane, which differs from the general self-adaptive Armijo-type procedure [18, 19]. For the second projection step, we project onto the intersection of the set C with two halfspaces, which makes an optimal stepsize available at each iteration and hence guarantees that the next iterate is the ‘closest’ to the solution set. Therefore, the iterative sequence generated by the algorithm converges more quickly. The algorithm is shown to converge to a point in the solution set under some assumptions.
The paper is organized as follows. In Section 2, we recall some preliminaries. In Section 3, we propose a hybrid CQ projection algorithm for the split feasibility problem and show its convergence. In Section 4, we give an example to test the efficiency. In Section 5, we give some concluding remarks.
Preliminaries
We denote by I the identity operator and by \(\operatorname {Fix}(T)\) the set of fixed points of an operator T, that is, \(\operatorname {Fix}(T):=\{x \mid x=Tx\}\).
Recall that a mapping \(T: \Re^{n}\rightarrow\Re^{n}\) is said to be monotone if
For a monotone mapping T, if \(\langle T(x)-T(y), x-y\rangle= 0\) implies \(x=y\), then T is said to be strictly monotone.
A mapping \(T: \Re^{n}\rightarrow\Re^{n}\) is called nonexpansive if
Lemma 2.1
Let Ω be a nonempty closed convex subset of \(\Re^{n}\). Then, for any \(x,y\in \Re^{n}\) and \(z\in\Omega\), it is well known that the following statements hold:

(1)
\(\langle P_{\Omega}(x)-x, z-P_{\Omega}(x)\rangle\geq0\).

(2)
\(\langle P_{\Omega}(x)-P_{\Omega}(y), x-y\rangle\geq0\).

(3)
\(\Vert P_{\Omega}(x)-P_{\Omega}(y)\Vert \leq \Vert x-y\Vert \), or more precisely,
$$\bigl\Vert P_{\Omega}(x)-P_{\Omega}(y)\bigr\Vert ^{2} \leq \Vert x-y\Vert ^{2}-\bigl\Vert P_{\Omega}(x)-x+y-P_{\Omega}(y) \bigr\Vert ^{2}. $$ 
(4)
\(\Vert P_{\Omega}(x)-z\Vert ^{2} \leq \Vert x-z\Vert ^{2}-\Vert P_{\Omega}(x)-x\Vert ^{2}\).
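Since these projection properties are used repeatedly below, a quick numerical sanity check may help intuition. The following sketch verifies the refined form of property (3) on random points for \(\Omega=[-1,1]^{3}\), whose projection is a componentwise clip (an illustrative choice of Ω, not from the paper):

```python
import numpy as np

# P_Omega for Omega = [-1,1]^3 is a componentwise clip.
proj = lambda x: np.clip(x, -1.0, 1.0)

rng = np.random.default_rng(0)
violations = 0
for _ in range(1000):
    x, y = rng.normal(size=3) * 2, rng.normal(size=3) * 2
    lhs = np.linalg.norm(proj(x) - proj(y)) ** 2
    # Refined nonexpansiveness: ||Px - Py||^2 <= ||x - y||^2 - ||Px - x + y - Py||^2
    rhs = np.linalg.norm(x - y) ** 2 - np.linalg.norm(proj(x) - x + y - proj(y)) ** 2
    if lhs > rhs + 1e-10:
        violations += 1
```

The inequality is equivalent to firm nonexpansiveness of the projection, \(\Vert P_{\Omega}(x)-P_{\Omega}(y)\Vert^{2}\leq\langle P_{\Omega}(x)-P_{\Omega}(y),x-y\rangle\), so no violations should occur.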
Remark 2.1
In fact, the projection property (1) also provides a sufficient and necessary condition for a vector \(u\in\Omega\) to be the projection of the vector x; that is, \(u=P_{\Omega}(x)\) if and only if
Throughout the paper, we denote by Γ the solution set of the split feasibility problem, that is,
Algorithm and convergence analysis
Let
From [10] we know that F is Lipschitz continuous with constant \(L=\rho(A^{T}A)\) and is \(\frac{1}{\rho(A^{T}A)}\)-inverse strongly monotone. We first note that the solution set coincides with the zeros of the following projected residual function:
With this definition, we have \(e(x,1)=e(x)\), and \(x\in\Gamma\) if and only if \(e(x,\mu)=0\) for any \(\mu>0\). For any \(x\in\Re^{N}\) and \(\alpha\geq0\), define
The following lemma is useful for the convergence analysis in the next section.
Lemma 3.1
[18]
Let F be a mapping from \(\Re^{N}\) into \(\Re^{N}\). For any \(x\in\Re^{N}\) and \(\alpha\geq0\), we have
The detailed iterative processes are as follows:
Algorithm 3.1
Step 0. Choose an arbitrary initial point \(x^{0}\in C\), \(\eta_{0}>0\), and three parameters \(\gamma\in(0,1)\), \(\sigma\in(0,1)\), and \(\theta >1\), and set \(k=0\).
Step 1. Given the current iterative point \(x^{k}\), compute
where \(\mu_{k}:=\min\{\theta\eta_{k-1},1\}\). Obviously, \(e(x^{k},\mu_{k})=x^{k}-z^{k}\). If \(e(x^{k}, \mu_{k})=0\), then stop; otherwise, go to Step 2.
Step 2. Compute
where \(\eta_{k}=\gamma^{m_{k}}\mu_{k}\) with \(m_{k}\) being the smallest nonnegative integer m satisfying
Step 3. Compute
where
Set \(k=k+1\) and go to Step 1.
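For intuition, the two basic quantities driving Steps 1-3, the operator \(F(x)=A^{T}(I-P_{Q})Ax\) and the projected residual, can be sketched numerically. The residual is written here as \(e(x,\mu)=x-P_{C}(x-\mu F(x))\), the standard form consistent with \(e(x^{k},\mu_{k})=x^{k}-z^{k}\) in Step 1; this form and the toy box instance are assumptions for illustration, since the displayed definitions are not reproduced above:

```python
import numpy as np

def F(x, A, proj_Q):
    # F(x) = A^T (I - P_Q) A x: the gradient-type operator of the CQ framework
    Ax = A @ x
    return A.T @ (Ax - proj_Q(Ax))

def residual(x, mu, A, proj_C, proj_Q):
    # e(x, mu) = x - P_C(x - mu * F(x)); x solves the SFP iff e(x, mu) = 0,
    # which is the stopping test of Step 1 (up to a tolerance in practice).
    return x - proj_C(x - mu * F(x, A, proj_Q))

# Toy check with A = I and C = Q = [-1,1]^2 (illustrative): the residual
# vanishes at a feasible point and is nonzero at an infeasible one.
A = np.eye(2)
box = lambda x: np.clip(x, -1.0, 1.0)
r_feasible = residual(np.array([0.5, 0.0]), 1.0, A, box, box)
r_infeasible = residual(np.array([3.0, 0.0]), 1.0, A, box, box)
```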
Remark 3.1
(1) In the algorithm, a projection from \(\Re^{N}\) onto the intersection \(C\cap H_{k}^{1}\cap H_{k}^{2}\), that is, procedure (3.3), needs to be computed at each iteration. Certainly, if the domain set C has a special structure such as a box or a ball, then the next iterate \(x^{k+1}\) can easily be computed. If the domain set C is defined by a set of linear (in)equalities, then computing the projection is equivalent to solving a strictly convex quadratic optimization problem.
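As a sketch of the quadratic-programming route just mentioned, projecting a point onto the intersection of a ball with halfspaces amounts to minimizing \(\Vert x-x^{0}\Vert ^{2}\) subject to the defining inequalities. The formulation below uses `scipy.optimize.minimize`, an off-the-shelf solver choice not prescribed by the paper:

```python
import numpy as np
from scipy.optimize import minimize

def project_intersection(x0, radius, halfspaces):
    # Projection of x0 onto {x : ||x|| <= radius} ∩ {x : a_i · x <= b_i},
    # computed as the strictly convex QP  min ||x - x0||^2  s.t. the constraints.
    cons = [{'type': 'ineq', 'fun': lambda x, a=a, b=b: b - a @ x}
            for a, b in halfspaces]
    cons.append({'type': 'ineq', 'fun': lambda x: radius ** 2 - x @ x})
    res = minimize(lambda x: np.sum((x - x0) ** 2), x0, constraints=cons)
    return res.x

# Example: project (2, 0) onto the unit ball intersected with {x : x_1 <= 0.5};
# the answer is (0.5, 0), where the halfspace constraint is active.
p = project_intersection(np.array([2.0, 0.0]), 1.0, [(np.array([1.0, 0.0]), 0.5)])
```

In Algorithm 3.1 the two halfspaces \(H_{k}^{1}\) and \(H_{k}^{2}\) would play the role of the `halfspaces` argument, with C supplying the remaining constraint.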
(2) It can readily be verified that the hyperplane \(H_{k}^{1}\) strictly separates the current point \(x^{k}\) from the solution set Γ if \(x^{k}\) is not a solution of the problem. That is, \(\Gamma\subset H_{k}^{1}\), and the hyperplane \(H_{k}^{2}\) strictly separates the initial point \(x^{0}\) from the solution set Γ.
(3) Compared with the general hybrid projection method in [21, 22], besides the major modification made in the projection domain in the last projection step, the values of some parameters involved in the algorithm are also adjusted.
Before establishing the global convergence of Algorithm 3.1, we first give some lemmas.
Lemma 3.2
For all \(k\geq0\), there exists a nonnegative number m satisfying (3.2).
Proof
Suppose that, for some k, (3.2) is not satisfied for any integer m, that is,
By the definition of \(e (x^{k},\mu_{k})\) and Lemma 2.1 we know that
Then
Since \(\gamma\in(0,1)\) and \(\mu_{k}:=\min\{\theta\eta_{k-1},1\}\), from (3.4) we get
Hence,
But (3.6) contradicts (3.5) because \(\sigma<1\) and \(\Vert e(x^{k},\mu_{k})\Vert >0\). Hence, (3.2) is satisfied for some integer m. □
The following lemma shows that the halfspace \(H_{k}^{1}\) in Algorithm 3.1 strictly separates \(x^{k}\) from the solution set Γ if Γ is nonempty.
Lemma 3.3
If \(\Gamma\neq\emptyset\), then the halfspace \(H_{k}^{1}\) in Algorithm 3.1 separates the point \(x^{k}\) from the set Γ. Moreover,
Proof
By the definition of \(e(x^{k},\mu_{k})\) and Algorithm 3.1 we have
which can be rewritten as
Then, by this and by (3.2) we get
Hence, by the definition of \(H_{k}^{1}\) and by (3.7) we get \(x^{k}\notin H_{k}^{1}\).
On the other hand, for any \(x^{\ast} \in\Gamma\) and \(x\in C\), by the monotonicity of F we have
By the convexity of C it is easy to see that \(y^{k}\in C\). Letting \(x=y^{k}\) in (3.9), we have
which implies \(x^{\ast}\in H_{k}^{1}\). Moreover, it is easy to see that \(\Gamma\subseteq H_{k}^{1}\cap C\), \(\forall k\geq0\). □
The following lemma says that if the solution set is nonempty, then \(\Gamma\subset H_{k}^{1}\cap H_{k}^{2}\cap C\) and thus \(H_{k}^{1}\cap H_{k}^{2}\cap C\) is a nonempty set.
Lemma 3.4
If the solution set \(\Gamma\neq\emptyset\), then \(\Gamma\subset H_{k}^{1}\cap H_{k}^{2}\cap C\) for all \(k\geq 0\).
Proof
From the previous analysis it is sufficient to prove that \(\Gamma\subset H_{k}^{2}\) for all \(k\geq0\). The proof will be given by induction. Obviously, if \(k=0\), then
Now, suppose that
for \(k=l\geq0\). Then
For any \(x^{\ast}\in\Gamma\), by Lemma 2.1 and the fact that
we have that
Thus, \(\Gamma\subset H_{l+1}^{2}\). This shows that \(\Gamma\subset H_{k}^{2}\) for all \(k\geq0\), and the desired result follows. □
For the case where the solution set is empty, the following lemma shows that \(H_{k}^{1}\cap H_{k}^{2}\cap C\) is still nonempty, which implies the feasibility of Algorithm 3.1.
Lemma 3.5
Suppose that \(\Gamma=\emptyset\). Then \(H_{k}^{1}\cap H_{k}^{2}\cap C\neq\emptyset\) for all \(k\geq 0\).
We next prove our main convergence result.
Theorem 3.1
Suppose the solution set Γ is nonempty. Then the sequence \(\{x^{k}\}\) generated by Algorithm 3.1 is bounded, and all its cluster points belong to the solution set. Moreover, the sequence \(\{x^{k}\}\) globally converges to a solution \(x^{\ast}\) such that \(x^{\ast}=P_{\Gamma}(x^{0})\).
Proof
Take an arbitrary point \(x^{\ast}\in\Gamma\). Then \(x^{\ast }\in H_{k}^{1}\cap H_{k}^{2}\cap C\). Since
by the definition of the projection we have that
So, \(\{x^{k}\}\) is a bounded sequence, and so is \(\{y^{k}\}\).
Since \(x^{k+1}\in H_{k}^{2}\), from the definition of the projection operator it is obvious that
For \(x^{k}\), by the definition of \(H_{k}^{2}\), for all \(z\in H_{k}^{2}\), we have
Obviously, \(x^{k}=P_{H_{k}^{2}}(x^{0})\). Thus, using Lemma 2.1, we have
that is,
which can be written as
Thus, the sequence \(\{\Vert x^{k}-x^{0}\Vert \}\) is nondecreasing and bounded and hence convergent, which implies that
On the other hand, by \(x^{k+1}\in H_{k}^{1}\) we get
Since
by (3.10) we have
which implies
Using the Cauchy-Schwarz inequality and (3.2), we obtain
By Lemma 3.1,
Since \(\mu_{k}=\min\{\theta\eta_{k-1},1\}\), we have \(\mu_{k}\leq1\). Hence,
Therefore,
Taking into account that \(\eta_{k}=\gamma ^{m_{k}}\mu_{k}\leq\mu_{k}\), we further obtain
Since F is continuous, there exists a constant M such that \(\Vert F(y^{k})\Vert \leq M\). By (3.9) and (3.12) it follows that
For any convergent subsequence \(\{x^{k_{j}}\}\) of \(\{x^{k}\}\), its limit is denoted by x̄, that is,
Now, we consider the two possible cases for (3.13).
Suppose first that \(\{\eta_{k_{j}}\}\) has a limit. Then
By the choice of \(\eta_{k_{j}}\) in Algorithm 3.1 we know that
Since
Furthermore,
So, by (3.14) we obtain
Using arguments similar to those for (3.2), we have
Since
from (3.15) we get that \(e(\bar{x})=0\), and thus x̄ is a solution of problem (1.1).
Suppose now that \(\limsup_{k\rightarrow\infty}\eta_{k}>0\). Because of (3.13), it must be that \(\liminf_{k\rightarrow\infty} \Vert e(x^{k})\Vert =0\). Since \(e(\cdot)\) is continuous, we get \(e(\bar{x})=0\), and thus x̄ is a solution of problem (1.1).
Now, we prove that the sequence \(\{x^{k}\}\) converges to a point contained in Γ.
Let \(x^{\ast}=P_{\Gamma}(x^{0})\). Since \(x^{\ast}\in\Gamma\), by Lemma 3.4 we have
for all j. So, by the iterative sequence of Algorithm 3.1 we have
Thus,
Letting \(j\rightarrow\infty\), we have
where the last inequality is due to Lemma 2.1 and the fact that \(x^{\ast}=P_{\Gamma}(x^{0})\) and \(\bar{x}\in\Gamma\). So,
Thus, the sequence \(\{x^{k}\}\) has a unique cluster point \(P_{\Gamma}(x^{0})\), which shows the global convergence of \(\{x^{k}\}\). □
Numerical experiments
To test the effectiveness of our algorithm, we implemented it in MATLAB to solve the following example. We use \(\Vert e(x^{k},\mu _{k})\Vert \leq\varepsilon=10^{-5}\) as the stopping criterion. We denote Algorithm 3.1 in [14] by Algorithm 3.1^{∗}. Throughout the computational experiment, the parameters used in Algorithm 3.1 were set as \(\gamma=0.6\), \(\sigma=0.8\), \(\theta=1.5\), \(\eta_{0}=0.3\), \(\beta=1\). The numerical results for the example are given in Table 1, where Iter denotes the number of iterations, CPU denotes the computing time, and \(x^{\ast}\) denotes the approximate solution.
Example
Let \(C = \{x \in\Re^{3} \mid x_{1}^{2}+x_{2}^{2} \leq 40\}\) and \(Q=\{x\in\Re^{3} \mid x_{2}-x_{3}^{2}\leq1\}\), \(A=I\). Find \(x\in C \) with \(Ax\in Q\).
From the numerical experiments for this simple example we can see that our proposed method has good convergence properties.
Some concluding remarks
This paper presented a hybrid CQ projection algorithm with two projection steps and one Armijo-type linesearch step for solving the split feasibility problem (SFP). Different from the self-adaptive projection methods proposed by Zhang et al. [18], we use a new linesearch rule, which ensures that the hyperplane \(H_{k}^{1}\) separates the current point \(x^{k}\) from the solution set Γ. The next iterate is generated by the projection of the starting point onto a shrinking projection region (the intersection of three sets). Preliminary numerical experiments demonstrate good behavior. However, whether the idea can be used to solve the multiple-sets SFP deserves further research.
References
 1.
Censor, Y, Elfving, T: A multiprojection algorithm using Bregman projections in a product space. Numer. Algorithms 8, 221-239 (1994)
 2.
Byrne, C: A unified treatment of some iterative algorithms in signal processing and image reconstruction. Inverse Probl. 20, 103-120 (2004)
 3.
Censor, Y, Motova, A, Segal, A: Perturbed projections and subgradient projections for the multiple-sets split feasibility problem. J. Math. Anal. Appl. 327, 1244-1256 (2007)
 4.
Censor, Y, Elfving, T, Kopf, N, Bortfeld, T: The multiple-sets split feasibility problem and its applications for inverse problems. Inverse Probl. 21, 2071-2084 (2005)
 5.
Dang, Y, Gao, Y: Bi-extrapolated subgradient projection algorithm for solving multiple-sets split feasibility problem. Appl. Math. J. Chin. Univ. Ser. B 29(3), 283-294 (2014)
 6.
Qu, B, Xiu, N: A new halfspace-relaxation projection method for the split feasibility problem. Linear Algebra Appl. 428, 1218-1229 (2008)
 7.
Dang, Y, Xue, ZH: Iterative process for solving a multiple-set split feasibility problem. J. Inequal. Appl. (2015). doi:10.1186/s13660-015-0576-9
 8.
Byrne, C: Iterative oblique projection onto convex sets and the split feasibility problem. Inverse Probl. 18, 441-453 (2002)
 9.
Xu, H: A variable Krasnosel'skii-Mann algorithm and the multiple-set split feasibility problem. Inverse Probl. 22, 2021-2034 (2006)
 10.
Dang, Y, Gao, Y: The strong convergence of a KM-CQ-like algorithm for split feasibility problem. Inverse Probl. (2011). doi:10.1088/0266-5611/27/1/015007
 11.
Yan, AL, Wang, GY, Xiu, NH: Robust solutions of split feasibility problem with uncertain linear operator. J. Ind. Manag. Optim. 3, 749-761 (2007)
 12.
Yang, Q: The relaxed CQ algorithm solving the split feasibility problem. Inverse Probl. 20, 1261-1266 (2004)
 13.
Zhao, J, Yang, Q: Several solution methods for the split feasibility problem. Inverse Probl. 21, 1791-1799 (2005)
 14.
Qu, B, Xiu, NH: A note on the CQ algorithm for the split feasibility problem. Inverse Probl. 21, 1655-1665 (2005)
 15.
Ansari, QH, Rehan, A: Split feasibility and fixed point problems. In: Nonlinear Analysis: Approximation Theory, Optimization and Applications, pp. 281-322. Springer, New York (2014)
 16.
Latif, A, Sahu, DR, Ansari, QH: Variable KM-like algorithms for fixed point problems and split feasibility problems. Fixed Point Theory Appl. 2014, Article ID 211 (2014)
 17.
He, B, Yang, H, Meng, Q, Han, D: Modified Goldstein-Levitin-Polyak projection method for asymmetric strongly monotone variational inequalities. J. Optim. Theory Appl. 112, 129-143 (2002)
 18.
Zhang, WX, Han, D, Li, ZB: A self-adaptive projection method for solving the multiple-sets split feasibility problem. Inverse Probl. 25, 115001 (2009)
 19.
Zhao, JL, Yang, Q: Self-adaptive projection methods for the multiple-sets split feasibility problem. Inverse Probl. 27, 035009 (2011)
 20.
Nakajo, K, Takahashi, W: Strong convergence theorems for nonexpansive mappings and nonexpansive semigroups. J. Math. Anal. Appl. 279(2), 372-379 (2003)
 21.
Kamimura, S, Takahashi, W: Strong convergence of a proximal-type algorithm in a Banach space. SIAM J. Optim. 13(3), 938-945 (2002)
 22.
Martinez-Yanes, C, Xu, HK: Strong convergence of the CQ method for fixed point iteration processes. Nonlinear Anal., Theory Methods Appl. 64(11), 2400-2411 (2006)
 23.
Takahashi, W, Takeuchi, Y, Kubota, R: Strong convergence theorems by hybrid methods for families of nonexpansive mappings in Hilbert spaces. J. Math. Anal. Appl. 341, 276-286 (2008)
 24.
Kumam, P: A hybrid approximation method for equilibrium and fixed point problems for a monotone mapping and a nonexpansive mapping. Nonlinear Anal. Hybrid Syst. 2, 1245-1255 (2008)
Acknowledgements
This work was supported by Natural Science Foundation of Shanghai (14ZR1429200) and Innovation Program of Shanghai Municipal Education Commission (15ZZ074).
Author information
Additional information
Competing interests
The authors declare that they have no competing interests.
Authors’ contributions
All authors contributed equally to the writing of this paper. All authors read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
About this article
Cite this article
Dang, Y., Xue, Z. & Wang, B. Hybrid CQ projection algorithm with linesearch process for the split feasibility problem. J Inequal Appl 2016, 106 (2016). https://doi.org/10.1186/s13660-016-1039-7
MSC
 47H05
 47J05
 47J25
Keywords
 split feasibility problem
 Armijotype linesearch technique
 projection algorithm
 convergence