 Research
 Open Access
Viscosity iterative algorithm for the zero point of monotone mappings in Banach spaces
 Yan Tang^{1}
https://doi.org/10.1186/s13660-018-1845-1
© The Author(s) 2018
 Received: 6 February 2018
 Accepted: 9 September 2018
 Published: 21 September 2018
Abstract
Inspired by the work of Zegeye (J. Math. Anal. Appl. 343:663–671, 2008) and the recent papers of Chidume et al. (Fixed Point Theory Appl. 2016:97, 2016; Br. J. Math. Comput. Sci. 18:1–14, 2016), we devise a viscosity iterative algorithm, without involving the resolvent operator, for approximating the zero of a monotone mapping in the setting of uniformly convex Banach spaces. Under concise parameter conditions we establish strong convergence of the proposed algorithm. Moreover, applications to constrained convex minimization problems and to the solution of Hammerstein integral equations are included. Finally, computational examples and a comparison with related algorithms are presented to illustrate the efficiency and applicability of the new algorithm.
Keywords
 Monotone mapping
 Zero point
 Viscosity approximation
 Strong convergence
MSC
 47H04
 46N10
 47H06
 47J25
1 Introduction
The proximal point algorithm of Martinet [9] is a successful and powerful method for finding a solution of the equation \(0\in Au\), and it was subsequently extended by many authors (see, e.g., Rockafellar [10], Chidume [11], Xu [12], Tang [13], Qin et al. [14]).
On the other hand, Browder [15] introduced an operator \(T:H\rightarrow H\) defined by \(T=I-A\), where I is the identity mapping on a Hilbert space H. The operator T is called pseudocontractive, and the zeros of the monotone operator A, if they exist, correspond to the fixed points of T. Therefore the approximation of the solutions of \(Au=0\) reduces to the approximation of the fixed points of a pseudocontractive mapping.
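Written out, Browder's correspondence is a one-line equivalence (a standard observation, spelled out here for completeness):
\[
Tu=u \iff (I-A)u=u \iff Au=0,
\]
so \(u\in A^{-1}(0)\) exactly when u is a fixed point of \(T=I-A\).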
Since the normalized duality map J is the identity map I in Hilbert spaces, following the idea of Browder [15], the approximation of solutions of \(0\in Au\) has been extended to normed spaces by numerous authors (see, for instance, Chidume [17, 18], Agarwal et al. [19], Reich [20], Diop [21], and the references therein), where A is a monotone mapping from E to itself.
Although the above results have desirable theoretical properties, such as weak and strong convergence to a solution of the equation \(0\in Au\), some difficulties remain. For instance, the technique of converting the zero of A into a fixed point of T in Browder [15] is not applicable here since, when A is monotone, A maps E into \(E^{*}\). In addition, the resolvent technique in Martinet [9] is not convenient to use because one has to compute the inverse of \((I+\lambda A)\) at each step of the iteration process.
Hence, it is only natural to ask the following question.
Question 1.1
Can we construct an algorithm without involving the resolvent operator to approximate a zero point of A in Banach spaces?
Motivated and inspired by the work of Martinet [9], Rockafellar [10], Zegeye [1], and Chidume et al. [2, 3], as well as Ibaraki and Takahashi [22], we wish to provide an affirmative answer to the question. Our contribution in the present work is a new viscosity iterative method for the solutions of the equation \(0 \in AJu\), that is, \(Ju\in A^{-1}(0)\), where \(A: E^{*} \rightarrow{E}\) is a monotone operator defined on the dual of a Banach space E and \(J:E\rightarrow E^{*}\) is the normalized duality map.
The outline of the paper is as follows. In Sect. 2, we collect definitions and results which are needed for our further analysis. In Sect. 3, our implicit and explicit algorithms, which do not involve a resolvent operator, are introduced and analyzed, and strong convergence to a zero of the composed mapping AJ under concise parameter conditions is obtained. In addition, the main result is applied to convex optimization problems and to the solution of Hammerstein equations. Finally, some numerical experiments and a comparison with related algorithms are given to illustrate the performance of the new algorithms.
2 Preliminaries
In the sequel, we shall need the following definitions and results.
 (i)
J is a monotone operator;
 (ii)
if E is smooth, then J is single-valued;
 (iii)
if E is reflexive, then J is onto;
 (iv)
if E is uniformly smooth, then J is uniformly continuous on bounded subsets of E.
Lemma 2.1
Let α be a real number, and let \((x_{0},x_{1},\ldots)\in l^{\infty}\) be such that \(\mu(x_{n})\leq\alpha\) for all Banach limits μ. If \(\limsup_{n\rightarrow\infty}(x_{n+1}-x_{n})\leq0\), then \(\limsup_{n\rightarrow\infty}x_{n}\leq\alpha\).
Lemma 2.2
(see, e.g., Tan and Xu [28], Osilike and Aniagbosor [30])
 (i)
\(\lim_{n\rightarrow\infty}\theta_{n}=0\), \(\sum_{n=1}^{\infty}\theta_{n}=\infty\);
 (ii)
\(\lim_{n\rightarrow\infty}\frac{\sigma_{n}}{\theta_{n}}\leq0\) or \(\sum_{n=0}^{\infty}\sigma_{n}<\infty\).
Lemma 2.3
(see, e.g., Xu [12])
Lemma 2.4
(Zegeye [1])
3 Main results
We now show the strong convergence of our implicit and explicit algorithms.
Theorem 3.1
Proof
Since E is 2-uniformly smooth, from Alber [16, 31] we have that J is \(L_{*}\)-Lipschitz continuous. Noticing that A is L-Lipschitz continuous, it follows that \(I-AJ\) is Lipschitz continuous with constant \(1+LL_{*}\).
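The Lipschitz constant \(1+LL_{*}\) follows from the triangle inequality and the two Lipschitz bounds (a routine estimate, spelled out for completeness):
\[
\|(I-AJ)x-(I-AJ)y\| \le \|x-y\| + \|AJx-AJy\| \le \|x-y\| + L\|Jx-Jy\| \le (1+LL_{*})\|x-y\|.
\]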
First, we show that \(x_{t}\) is well-defined. Since \(\lim_{t\rightarrow0}\frac{\omega(t)}{t}=0\), for every \(\varepsilon>0\) there exists \(\delta>0\) such that \(\frac{\omega(t)}{t}<\varepsilon\) for all \(t\in(0,\delta)\).
 (C1): \(\lim_{n\rightarrow\infty}\alpha_{n}=0\), \(\sum_{n=1}^{\infty}\alpha_{n}=\infty\); \(\lim_{n\rightarrow\infty}\frac{\omega_{n}}{\alpha_{n}}=0\) and \(\sum_{n=0}^{\infty}\omega_{n}<\infty\);
 (C2): f is a piecewise function: \(f(x^{*})=x^{*}\) if \(x^{*}\in(AJ)^{-1}(0)\); otherwise f is a contraction with coefficient ρ.
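Since the explicit scheme (3.2) itself is not reproduced in this excerpt, the following sketch is only a hypothetical illustration on the real line (where \(J=I\)): it assumes a generic viscosity-type step \(x_{n+1}=\alpha_{n}f(x_{n})+(1-\alpha_{n})(I-AJ)x_{n}\), the monotone map \(A(x)=ax\), and the piecewise map f of (C2) with contraction coefficient \(\rho=1/2\); none of these concrete choices come from the paper.

```python
# Hypothetical sketch of a viscosity-type iteration for 0 in AJu on the real
# line, where J = I. The paper's actual scheme (3.2) may differ; this only
# illustrates the roles of alpha_n and the piecewise contraction f of (C2).

def f(x, zero_tol=1e-12, rho=0.5):
    # Piecewise map of (C2): identity on (approximate) zeros of AJ,
    # a rho-contraction (toward 0, chosen arbitrarily) elsewhere.
    return x if abs(x) < zero_tol else rho * x

def via_sketch(x0, a=0.5, n_iter=60):
    # A(x) = a*x is a-Lipschitz continuous and monotone; its unique zero is 0.
    x = x0
    for n in range(1, n_iter + 1):
        alpha = 1.0 / (n + 1)          # lim alpha_n = 0, sum alpha_n = inf
        x = alpha * f(x) + (1 - alpha) * (x - a * x)
    return x

print(via_sketch(1.0))  # a small number: the iterates approach the zero of A
```

With these particular choices each step contracts the iterate toward 0, so the sketch converges quickly; it is meant only to make the roles of the parameters concrete.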
Theorem 3.2
Proof
According to the definition of f, it is obvious that if \(x_{n}\in (AJ)^{-1}(0)\), then we stop the iteration. Otherwise, we set \(n:=n+1\) and return to iterative step (3.2).
The proof includes three steps.
Step 1: First we prove that \(\{x_{n}\}\) is bounded. Since \(\alpha _{n}\rightarrow0\) and \(\lim_{n\rightarrow\infty}\frac{\omega_{n}}{\alpha _{n}}=0\) as \(n\rightarrow\infty\), there exists \(N_{0}>0\) such that \(\alpha _{n}\leq\frac{1}{6}\), \(\frac{\omega_{n}}{\alpha_{n}}\leq\frac{1}{6LL_{*}}\), \(\forall n>N_{0}\). We take \(x^{*}\in(AJ)^{-1}(0)\), that is, \(Jx^{*}\in A^{-1}(0)\). Let \(r>0\) be sufficiently large such that \(x_{N_{0}}\in B_{r}(x^{*})\) and \(f(x_{N_{0}})\in B_{\frac{r}{6}}(x^{*})\).
We show that \(\{x_{n}\}\) belongs to \(B:=\overline{B_{r}(x^{*})}\) for all integers \(n\geq N_{0}\). First, it is clear by construction that \(x_{N_{0}}\in B\). Assuming now that, for an arbitrary \(n>N_{0}\), \(x_{n}\in B\), we prove that \(x_{n+1} \in B\).
This is a contradiction. Consequently, \(\{x_{n}\}\) belongs to B for all integers \(n\geq N_{0}\), which implies that the sequence \(\{x_{n}\}\) is bounded, and so are the sequences \(\{f(x_{n})\}\) and \(\{AJx_{n}\}\).
Step 2: We show that \(\limsup_{n\rightarrow\infty}\langle z-f(x_{n}),j(z-x_{n+1})\rangle \leq0\), where \(z\in C_{\min}\cap(AJ)^{-1}(0)\).
Step 3: Next we show that \(\|x_{n+1}-z\| \rightarrow 0\).
Theorem 3.3
Proof
Similar to the proof of Theorem 3.2, we can show that the sequences \(\{x_{n}\}\) and \(\{AJx_{n}\}\) are bounded. Furthermore, we have \(\limsup_{n\rightarrow\infty}\langle x_{n}-z,j(x_{n+1}-z)\rangle\leq0\), where \(z\in C_{\min}\cap (AJ)^{-1}(0)\).
According to Zegeye [1] and Liu [32], for a mapping \(T:E\rightarrow E^{*}\), a point \(x^{*}\in E\) is called a J-fixed point of T if and only if \(Tx^{*}=Jx^{*}\), and T is called semi-pseudo if and only if \(A:=J-T\) is monotone. We observe that a zero point of A is a J-fixed point of a semi-pseudo mapping. If E is a Hilbert space, a semi-pseudo mapping and a J-fixed point coincide with a pseudocontractive mapping and a fixed point of the pseudocontraction, respectively. In the case that the semi-pseudo mapping T is from \(E^{*}\) to E, we have that \(AJ:=(J^{-1}-T)J\) is monotone, and the J-fixed point set is denoted by \(F_{J}(T)=\{x\in E : x=TJx\}\). We have the following corollaries for semi-pseudo mappings from \(E^{*}\) to E.
Corollary 3.4
Corollary 3.5
(Zegeye [1])
Proof
Taking \(f(x)\equiv u\) in Theorem 3.2, the result follows. □
If we interchange the roles of E and \(E^{*}\), then we obtain the following results.
Theorem 3.6
Theorem 3.7
(Zegeye [1])
We give below two examples in order to show that the conditions of explicit iterative Algorithm (3.2) are easily satisfied.
Example 1
 (1)
\(\lim_{n\rightarrow\infty}\alpha_{n}=0\), \(\sum_{n=1}^{\infty}\alpha_{n}=\infty\);
 (2)
\(\lim_{n\rightarrow\infty}\frac{\omega_{n}}{\alpha_{n}}=0\) and \(\sum_{n=1}^{\infty}\omega_{n}<\infty\).
Example 2
 (1)
\(\lim_{n\rightarrow\infty}\alpha_{n}=0\), \(\sum_{n=1}^{\infty}\alpha_{n}=\infty\);
 (2)
\(\lim_{n\rightarrow\infty}\frac{\omega_{n}}{\alpha_{n}}=0\) and \(\sum_{n=1}^{\infty}\omega_{n}<\infty\).
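The concrete sequences of Examples 1 and 2 do not appear in this excerpt; as one hypothetical instance, \(\alpha_{n}=1/\sqrt{n+1}\) and \(\omega_{n}=1/(n\sqrt{n+1})\) satisfy conditions (1)-(2), which can be checked numerically (evidence, not a proof):

```python
import math

# Hypothetical parameter sequences (not necessarily those of Examples 1-2):
def alpha(n):
    return 1.0 / math.sqrt(n + 1)

def omega(n):
    return 1.0 / (n * math.sqrt(n + 1))

# omega_n / alpha_n = 1/n -> 0, sum(alpha_n) diverges (partial sums ~ 2*sqrt(N)),
# and sum(omega_n) converges by comparison with sum n^(-3/2).
ratio_tail = omega(10**4) / alpha(10**4)                  # = 1/10**4
alpha_partial = sum(alpha(n) for n in range(1, 10**5))    # grows without bound
omega_partial = sum(omega(n) for n in range(1, 10**5))    # stays bounded
print(ratio_tail, alpha_partial, omega_partial)
```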
4 Applications
In this section, we consider the constrained convex minimization problems and the solution of Hammerstein integral equations as the applications of our main result which is proposed in Sect. 3.
4.1 Application to constrained convex minimization problems
Lemma 4.1
Proof
This completes the proof. □
Remark 4.2
From Lemma 4.1, the subdifferential ∂h is monotone, and hence \(T=J-\partial h\) is a semi-pseudo mapping from E to \(E^{*}\).
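The monotonicity of ∂h used in Remark 4.2 follows directly from the subdifferential inequality: for \(x^{*}\in\partial h(x)\) and \(y^{*}\in\partial h(y)\),
\[
h(y)\ge h(x)+\langle x^{*}, y-x\rangle, \qquad h(x)\ge h(y)+\langle y^{*}, x-y\rangle,
\]
and adding the two inequalities yields \(\langle x^{*}-y^{*}, x-y\rangle\ge0\).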
Consequently, the following theorems are obtained.
Theorem 4.3
Theorem 4.4
4.2 Application to solution of Hammerstein integral equations
Theorem 4.5
Theorem 4.6
5 Numerical example
In the sequel, we give a numerical example to illustrate the applicability, effectiveness, efficiency, and stability of our viscosity iterative algorithm (VIA). All codes were written in Matlab R2016b and performed on an LG dual-core personal computer.
5.1 Numerical behavior of VIA
Example
Hence, A is a-Lipschitz continuous and monotone, and J is 1-Lipschitz continuous.
Two groups of parameter sequences are tested as follows:
Case I: \(\alpha_{n}=\frac{1}{(n+1)^{p}}\), \(\omega_{n}=\frac{1}{n(n+1)^{p}}\), \(p\in\{1/8,1/4,1/3,1/2,1\}\);
Case II: \(\alpha_{n}=\frac{1}{\ln^{p}(n+1)}\), \(\omega_{n}=\frac{1}{n\ln ^{p}(n+1)}\), \(p\in\{1/8,1/4,1/3,1/2,1\}\).
 (i)
\(\lim_{n\rightarrow\infty}\alpha_{n}=0\), \(\sum_{n=1}^{\infty}\alpha_{n}=\infty\);
 (ii)
\(\omega_{n}=o(\alpha_{n})\), \(\sum_{n=1}^{\infty}\omega_{n}<\infty\).
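As a quick numerical sanity check (evidence, not a proof), Case I with \(p=1\) can be tested against conditions (i)-(ii); note that \(\sum_{n=1}^{N} \frac{1}{n(n+1)}\) even telescopes to \(1-\frac{1}{N+1}\):

```python
# Case I of the experiments with p = 1: alpha_n = 1/(n+1), omega_n = 1/(n(n+1)).
def alpha(n):
    return 1.0 / (n + 1)

def omega(n):
    return 1.0 / (n * (n + 1))

# (i): alpha_n -> 0 while its partial sums grow like log(N), i.e. diverge;
# (ii): omega_n / alpha_n = 1/n -> 0 and sum omega_n telescopes to 1 - 1/(N+1).
N = 10**6
alpha_partial = sum(alpha(n) for n in range(1, N))   # ~ log(N), unbounded in N
omega_partial = sum(omega(n) for n in range(1, N))   # -> 1 as N -> infinity
print(alpha_partial, omega_partial)
```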
Algorithm (VIA) with different groups of parameters

                              a = 0.01   a = 0.5   a = 0.99
  p = 1
    Case I   No. iterations      10        10        12
             CPU time (s)         0.047     0.047     0.055
    Case II  No. iterations      10        10        12
             CPU time (s)         0.044     0.048     0.045
  p = 2
    Case I   No. iterations      10        10        12
             CPU time (s)         0.056     0.058     0.062
    Case II  No. iterations      10        10        12
             CPU time (s)         0.068     0.055     0.052
 (a)
The rate of \(D_{n}=10^{10}\times\|x_{n+1}-x_{n}\|^{2}\) generated by our algorithm (VIA) depends strictly on the convergence rate of the parameter sequence \(\{\alpha_{n}\}\) and on the Lipschitz coefficient of the continuous monotone operator.
 (b)
Our viscosity iterative algorithm (VIA) works well when the parameter sequence \(\{\alpha_{n}\}\) converges quickly to 0 as \(n\rightarrow\infty\). In general, if \(D_{n}=\|x_{n+1}-x_{n}\|^{2}\), then the error of \(D_{n}\) reaches approximately 10^{−16}; once \(D_{n}\) reaches this level, the iteration becomes unstable. The best error of \(D_{n}\), approximately 10^{−30}, is obtained when \(a=2\).
 (c)
For the second group of parameters, where \(\{\alpha_{n}\}\) converges slowly to 0 as \(n\rightarrow\infty\), \(D_{n}\) increases slightly in the early iterations and thereafter remains almost stable.
5.2 Comparison of VIA with other algorithms
Comparison between VIA and other algorithms with \(x_{0}=1\)

                       VIA              RM (u = 1)        GMIM
  a     TOL        Iter   CPU (s)    Iter   CPU (s)    Iter   CPU (s)
  1/4   10^{−8}     278   0.094        85   0.055       559   0.097
        10^{−10}   1290   0.14        325   0.070      3529   0.28
  1/3   10^{−8}     266   0.076       100   0.044       473   0.078
        10^{−10}   1233   0.14        380   0.088      2663   0.21
  1/2   10^{−8}     243   0.070       126   0.052       317   0.065
        10^{−10}   1123   0.13        472   0.077      1471   0.13
  3/2   10^{−8}     128   0.061       221   0.059        37   0.043
        10^{−10}    587   0.10        823   0.098        94   0.049
From these tables, we can see that the RM performs best for small values of a, while the GMIM is the most time-consuming. A reasonable explanation is that at each step the GMIM has no contractive parameters (coefficients) for obtaining the next iterate, which can lead to a lower convergence rate, whereas the convergence rate of the RM depends strictly on the chosen constant u and the initial value \(x_{0}\). Compared with the other two methods, VIA is competitive. The main advantage of VIA, however, is that it works more stably than the other methods and that it operates in Banach spaces, which are much more general than Hilbert spaces.
6 Conclusion
Let E be a uniformly convex and 2-uniformly smooth Banach space with dual \(E^{*}\). We construct implicit and explicit algorithms for solving the equation \(0\in AJu\) in the Banach space E, where \(A:E^{*}\rightarrow E\) is a monotone mapping and \(J:E\rightarrow E^{*}\) is the normalized duality map, which plays an indispensable role in this paper. The advantages of the algorithm are that the resolvent operator is not involved, which keeps the iteration simple to compute, and that the zero point problem for monotone mappings is extended from Hilbert spaces to Banach spaces. The proposed algorithms converge strongly to a zero of the composed mapping AJ under concise parameter conditions. In addition, the main result is applied to approximate the minimizer of a proper convex function and the solution of Hammerstein integral equations. To some extent, our results extend and unify results of Xu [12], Zegeye [1], Chidume and Idu [2], Chidume [3, 35], and Ibaraki and Takahashi [22].
Declarations
Acknowledgements
The author expresses deep gratitude to the referee and the editor for their valuable comments and suggestions, which helped tremendously in improving the quality of this paper and made it suitable for publication.
Funding
This article is funded by the National Science Foundation of China (11471059), the Science and Technology Research Project of Chongqing Municipal Education Commission (KJ1706154), and the Research Project of Chongqing Technology and Business University (KFJJ2017069).
Authors’ contributions
The author drafted, read, and approved the final manuscript.
Competing interests
The author declares that they have no competing interests.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Authors’ Affiliations
References
 Zegeye, H.: Strong convergence theorems for maximal monotone mappings in Banach spaces. J. Math. Anal. Appl. 343, 663–671 (2008)
 Chidume, C.E., Kennedy, O.I.: Approximation of zeros of bounded maximal monotone mappings, solutions of Hammerstein integral equations and convex minimization problem. Fixed Point Theory Appl. 2016, 97 (2016)
 Chidume, C.E., Romanus, O.M., Nnyaba, U.V.: A new iterative algorithm for zeros of generalized Phi-strongly monotone and bounded maps with application. Br. J. Math. Comput. Sci. 18(1), 1–14 (2016)
 Zarantonello, E.H.: Solving functional equations by contractive averaging. Tech. Rep. 160, U.S. Army Math. Research Center, Madison, Wisconsin (1960)
 Minty, G.J.: Monotone (nonlinear) operators in Hilbert spaces. Duke Math. J. 29(4), 341–346 (1962)
 Kac̆urovskii, R.I.: On monotone operators and convex functionals. Usp. Mat. Nauk 15(4), 213–215 (1960)
 Chidume, C.E.: An approximation method for monotone Lipschitz operators in Hilbert spaces. J. Aust. Math. Soc. Ser. A 41, 59–63 (1986)
 Berinde, V.: Iterative Approximation of Fixed Points. Lecture Notes in Mathematics. Springer, London (2007)
 Martinet, B.: Régularisation d'inéquations variationnelles par approximations successives. Rev. Fr. Inform. Rech. Opér. 4, 154–158 (1970)
 Rockafellar, R.T.: Monotone operators and the proximal point algorithm. SIAM J. Control Optim. 14, 877–898 (1976)
 Chidume, C.E.: An approximation method for monotone Lipschitzian operators in Hilbert spaces. J. Aust. Math. Soc. Ser. A 41, 59–63 (1986)
 Xu, H.K.: A regularization method for the proximal point algorithm. J. Glob. Optim. 36, 115–125 (2006)
 Tang, Y.: Strong convergence of viscosity approximation methods for the fixed point of pseudo-contractive and monotone mappings. Fixed Point Theory Appl. 2013, 273 (2013). https://doi.org/10.1186/1687-1812-2013-273
 Qin, X.L., Kang, S.M., Cho, Y.J.: Approximating zeros of monotone operators by proximal point algorithms. J. Glob. Optim. 46, Article ID 75 (2010)
 Browder, F.E.: Nonlinear mappings of nonexpansive and accretive type in Banach spaces. Bull. Am. Math. Soc. 73, 875–882 (1967)
 Alber, Y.: Metric and generalized projection operators in Banach spaces: properties and applications. In: Kartsatos, A.G. (ed.) Theory and Applications of Nonlinear Operators of Accretive and Monotone Type, pp. 15–50. Dekker, New York (1996)
 Chidume, C.E.: Geometric Properties of Banach Spaces and Nonlinear Iterations. Lecture Notes in Mathematics. Springer, London (2009)
 Chidume, C.E.: Iterative approximation of fixed points of Lipschitzian strictly pseudocontractive mappings. Proc. Am. Math. Soc. 99(2), 283–288 (1987)
 Agarwal, R.P., Meehan, M., O'Regan, D.: Fixed Point Theory and Applications. Cambridge Tracts in Mathematics, vol. 141. Cambridge University Press, Cambridge (2001)
 Reich, S.: A weak convergence theorem for alternating methods with Bregman distance. In: Kartsatos, A.G. (ed.) Theory and Applications of Nonlinear Operators of Accretive and Monotone Type. Lecture Notes in Pure and Appl. Math., vol. 178, pp. 313–318. Dekker, New York (1996)
 Diop, C., Sow, T.M.M., Djitte, N., Chidume, C.E.: Constructive techniques for zeros of monotone mappings in certain Banach spaces. SpringerPlus 4, 383 (2015)
 Ibaraki, T., Takahashi, W.: A new projection and convergence theorems for the projections in Banach spaces. J. Approx. Theory 149(1), 1–14 (2007)
 Cioranescu, I.: Geometry of Banach Spaces, Duality Mappings and Nonlinear Problems. Kluwer Academic, Dordrecht (1990)
 Xu, Z.B., Roach, G.F.: Characteristic inequalities of uniformly convex and uniformly smooth Banach spaces. J. Math. Anal. Appl. 157, 189–210 (1991)
 Xu, H.K.: Inequalities in Banach spaces with applications. Nonlinear Anal. 16(12), 1127–1138 (1991)
 Zălinescu, C.: On uniformly convex functions. J. Math. Anal. Appl. 95, 344–374 (1983)
 Kamimura, S., Takahashi, W.: Strong convergence of a proximal-type algorithm in Banach spaces. SIAM J. Optim. 13(3), 938–945 (2002)
 Tan, K.K., Xu, H.K.: Approximating fixed points of nonexpansive mappings by the Ishikawa iteration process. J. Math. Anal. Appl. 178(2), 301–308 (1993)
 Takahashi, W.: Nonlinear Functional Analysis: Fixed Point Theory and Its Applications. Yokohama Publishers, Yokohama (2000)
 Osilike, M.O., Aniagbosor, S.C.: Weak and strong convergence theorems for fixed points of asymptotically nonexpansive mappings. Math. Comput. Model. 32(10), 1181–1191 (2000)
 Alber, Y., Ryazantseva, I.: Nonlinear Ill-Posed Problems of Monotone Type. Springer, London (2006)
 Liu, B.: Fixed point of strong duality pseudocontractive mappings and applications. Abstr. Appl. Anal. 2012, Article ID 623625 (2012). https://doi.org/10.1155/2012/623625
 Chidume, C.E., Zegeye, H.: Approximation of solutions of nonlinear equations of monotone and Hammerstein type. Appl. Anal. 82(8), 747–758 (2003)
 Zegeye, H.: Iterative solution of nonlinear equations of Hammerstein type. J. Inequal. Pure Appl. Math. 4(5), Article ID 92 (2003)
 Chidume, C.E., Idu, K.O.: Approximation of zeros of bounded maximal monotone mappings, solutions of Hammerstein integral equations and convex minimization problems. Fixed Point Theory Appl. 2016, 97 (2016). https://doi.org/10.1186/s13663-016-0582-8