In the sequel, we will use the following results.
Lemma 2.1 [29]
Suppose that \{{\rho}_{n}\}, \{{\sigma}_{n}\} are two sequences of nonnegative numbers such that, for some real number {N}_{0}\ge 1,
{\rho}_{n+1}\le {\rho}_{n}+{\sigma}_{n}
for all n\ge {N}_{0}. Then we have the following:

(1)
If \sum {\sigma}_{n}<\mathrm{\infty}, then {lim}_{n\to \mathrm{\infty}}{\rho}_{n} exists.

(2)
If \sum {\sigma}_{n}<\mathrm{\infty} and \{{\rho}_{n}\} has a subsequence converging to zero, then {lim}_{n\to \mathrm{\infty}}{\rho}_{n}=0.
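A quick numerical illustration of Lemma 2.1 (a sketch with arbitrarily chosen sequences; the lemma itself is quoted from [29]). We build a nonnegative sequence {\rho}_{n} with {\rho}_{n+1}\le {\rho}_{n}+{\sigma}_{n} for a summable perturbation {\sigma}_{n}=1/{n}^{2}, and observe that {\rho}_{n} converges; since a subsequence tends to zero here, the whole sequence tends to zero, as in part (2).

```python
# Illustration of Lemma 2.1: rho_{n+1} <= rho_n + sigma_n with
# sum(sigma_n) < infinity forces lim rho_n to exist; if a subsequence
# of rho_n tends to 0, the whole sequence tends to 0.
# The sequences below are illustrative choices, not from the paper.

def perturbed_sequence(n_terms):
    rho = [1.0]
    for n in range(1, n_terms):
        sigma = 1.0 / n ** 2        # summable perturbation
        # any update with rho_{n+1} <= rho_n + sigma_n qualifies;
        # here we shrink rho_n and add the full perturbation
        rho.append(0.5 * rho[-1] + sigma)
    return rho

seq = perturbed_sequence(10000)
```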
Lemma 2.2 [31]
For all x,y\in H and \lambda \in [0,1], the following well-known identity holds:
{\parallel (1-\lambda )x+\lambda y\parallel}^{2}=(1-\lambda ){\parallel x\parallel}^{2}+\lambda {\parallel y\parallel}^{2}-\lambda (1-\lambda ){\parallel x-y\parallel}^{2}.
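The identity of Lemma 2.2 can be verified numerically; the following is a minimal sanity check in {\mathbb{R}}^{3} with arbitrarily chosen vectors and \lambda (illustrative only, not part of the argument).

```python
# Numerical check of the identity in Lemma 2.2 for vectors in R^3:
# ||(1-l)x + l*y||^2 = (1-l)||x||^2 + l||y||^2 - l(1-l)||x-y||^2.
import random

def nrm2(v):                        # squared Euclidean norm
    return sum(c * c for c in v)

random.seed(0)
x = [random.uniform(-1, 1) for _ in range(3)]
y = [random.uniform(-1, 1) for _ in range(3)]
lam = 0.37                          # arbitrary lambda in [0, 1]

lhs = nrm2([(1 - lam) * a + lam * b for a, b in zip(x, y)])
rhs = ((1 - lam) * nrm2(x) + lam * nrm2(y)
       - lam * (1 - lam) * nrm2([a - b for a, b in zip(x, y)]))
assert abs(lhs - rhs) < 1e-12
```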
Now, we prove our main results.
Lemma 2.3 Let H be a Hilbert space. Then, for all x,{x}_{i}\in H, i=1,2,\dots ,k,
\begin{array}{rcl}{\parallel \alpha x+\sum _{i=1}^{k}{\beta}^{i}{x}_{i}\parallel}^{2}& =& \alpha {\parallel x\parallel}^{2}+\sum _{i=1}^{k}{\beta}^{i}{\parallel {x}_{i}\parallel}^{2}-\sum _{i=1}^{k}\alpha {\beta}^{i}{\parallel {x}_{i}-x\parallel}^{2}\\ & & -\underset{i\ne j}{\overset{k}{\sum _{i,j=1}}}{\beta}^{i}{\beta}^{j}{\parallel {x}_{i}-{x}_{j}\parallel}^{2},\end{array}
(2.1)
where \alpha ,{\beta}^{i}\in [0,1], i=1,2,\dots ,k, and \alpha +{\sum}_{i=1}^{k}{\beta}^{i}=1.
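Before proving the lemma, here is a numerical sanity check of (2.1) in {\mathbb{R}}^{3} with k=3 (illustrative only; the double sum over i\ne j is read as a sum over unordered pairs i<j, consistent with (2.2)).

```python
# Numerical check of identity (2.1) in R^3 with k = 3, reading the
# double sum over i != j as a sum over unordered pairs i < j.
import random

def nrm2(v):                        # squared Euclidean norm
    return sum(c * c for c in v)

def comb(coeffs, vecs):             # linear combination of vectors
    return [sum(c * v[d] for c, v in zip(coeffs, vecs))
            for d in range(len(vecs[0]))]

random.seed(1)
x = [random.uniform(-1, 1) for _ in range(3)]
xs = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(3)]
beta = [0.2, 0.3, 0.1]
alpha = 1 - sum(beta)               # alpha + sum(beta) = 1

lhs = nrm2(comb([alpha] + beta, [x] + xs))
rhs = alpha * nrm2(x) + sum(b * nrm2(xi) for b, xi in zip(beta, xs))
rhs -= sum(alpha * b * nrm2([a - c for a, c in zip(xi, x)])
           for b, xi in zip(beta, xs))
rhs -= sum(beta[i] * beta[j]
           * nrm2([a - c for a, c in zip(xs[i], xs[j])])
           for i in range(3) for j in range(i + 1, 3))
assert abs(lhs - rhs) < 1e-12
```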
Proof For any {x}_{i}\in H, i=1,2,\dots ,k, it can be easily seen that
{\parallel \sum _{i=1}^{k}{x}_{i}\parallel}^{2}=\sum _{i=1}^{k}{\parallel {x}_{i}\parallel}^{2}+2\underset{i\ne j}{\overset{k}{\sum _{i,j=1}}}Re\langle {x}_{i},{x}_{j}\rangle .
(2.2)
Consider the following:
\begin{array}{rcl}{\parallel \alpha x+\sum _{i=1}^{k}{\beta}^{i}{x}_{i}\parallel}^{2}& =& {\parallel (1-\sum _{i=1}^{k}{\beta}^{i})x+\sum _{i=1}^{k}{\beta}^{i}{x}_{i}\parallel}^{2}\\ & =& {\parallel x+\sum _{i=1}^{k}{\beta}^{i}({x}_{i}-x)\parallel}^{2}\\ & =& {\parallel x\parallel}^{2}+\sum _{i=1}^{k}{({\beta}^{i})}^{2}{\parallel {x}_{i}-x\parallel}^{2}+2\sum _{i=1}^{k}{\beta}^{i}Re\langle {x}_{i}-x,x\rangle \\ & & +2\underset{i\ne j}{\overset{k}{\sum _{i,j=1}}}{\beta}^{i}{\beta}^{j}Re\langle {x}_{i}-x,{x}_{j}-x\rangle .\end{array}
(2.3)
For all i,j=1,2,\dots ,k, we have
2Re\langle {x}_{i}-x,x\rangle ={\parallel {x}_{i}\parallel}^{2}-{\parallel {x}_{i}-x\parallel}^{2}-{\parallel x\parallel}^{2},
(2.4)
and
2Re\langle {x}_{i}-x,{x}_{j}-x\rangle =-{\parallel {x}_{i}-{x}_{j}\parallel}^{2}+{\parallel {x}_{i}-x\parallel}^{2}+{\parallel {x}_{j}-x\parallel}^{2}.
(2.5)
Substituting (2.4) and (2.5) in (2.3) and collecting terms, we obtain the identity (2.1). This completes the proof. □
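The two polarization identities used in the proof can also be checked numerically; the following sketch verifies (2.4) and (2.5) in {\mathbb{R}}^{3} with arbitrary vectors (illustrative only).

```python
# Numerical check of the polarization identities (2.4) and (2.5) in R^3.
import random

def nrm2(v):                        # squared Euclidean norm
    return sum(c * c for c in v)

def inner(u, v):                    # Euclidean inner product
    return sum(a * b for a, b in zip(u, v))

random.seed(2)
x, xi, xj = ([random.uniform(-1, 1) for _ in range(3)] for _ in range(3))
di = [a - b for a, b in zip(xi, x)]           # x_i - x
dj = [a - b for a, b in zip(xj, x)]           # x_j - x
dij = [a - b for a, b in zip(xi, xj)]         # x_i - x_j

# (2.4): 2<x_i - x, x> = ||x_i||^2 - ||x_i - x||^2 - ||x||^2
err24 = abs(2 * inner(di, x) - (nrm2(xi) - nrm2(di) - nrm2(x)))
# (2.5): 2<x_i - x, x_j - x> = -||x_i - x_j||^2 + ||x_i - x||^2 + ||x_j - x||^2
err25 = abs(2 * inner(di, dj) - (-nrm2(dij) + nrm2(di) + nrm2(dj)))
assert err24 < 1e-12 and err25 < 1e-12
```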
Remark 2.1 Lemma 2.2 is the special case k=1 of our result Lemma 2.3.
Theorem 2.1 Let K be a compact convex subset of a real Hilbert space H and let {T}_{i}:K\to K, i=1,2,\dots ,k, be a family of continuous hemicontractive mappings. Let {\alpha}_{n},{\beta}_{n}^{i}\in [0,1] be such that {\alpha}_{n}+{\sum}_{i=1}^{k}{\beta}_{n}^{i}=1 and \{{\alpha}_{n}\},\{{\beta}_{n}^{i}\}\subset [\delta ,1-\delta ] for some \delta \in (0,1), i=1,2,\dots ,k.
Then, for arbitrary {x}_{0}\in K, the sequence \{{x}_{n}\} defined by (1.9) converges strongly to a common fixed point in {\bigcap}_{i=1}^{k}F({T}_{i})\ne \mathrm{\varnothing}.
Proof Let {x}^{\ast}\in {\bigcap}_{i=1}^{k}F({T}_{i}). Using the fact that the mappings {T}_{i}, i=1,2,\dots ,k, are hemicontractive, we obtain
{\parallel {T}_{i}{x}_{n}-{x}^{\ast}\parallel}^{2}\le {\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}+{\parallel {x}_{n}-{T}_{i}{x}_{n}\parallel}^{2}.
(2.6)
With the help of (1.9), Lemma 2.3 and (2.6), we obtain the following estimates:
\begin{array}{rcl}{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}& =& {\parallel {\alpha}_{n}{x}_{n-1}+\sum _{i=1}^{k}{\beta}_{n}^{i}{T}_{i}{x}_{n}-{x}^{\ast}\parallel}^{2}\\ & =& {\parallel {\alpha}_{n}({x}_{n-1}-{x}^{\ast})+\sum _{i=1}^{k}{\beta}_{n}^{i}({T}_{i}{x}_{n}-{x}^{\ast})\parallel}^{2}\\ & =& {\alpha}_{n}{\parallel {x}_{n-1}-{x}^{\ast}\parallel}^{2}+\sum _{i=1}^{k}{\beta}_{n}^{i}{\parallel {T}_{i}{x}_{n}-{x}^{\ast}\parallel}^{2}-\sum _{i=1}^{k}{\alpha}_{n}{\beta}_{n}^{i}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2}\\ & & -\underset{i\ne j}{\overset{k}{\sum _{i,j=1}}}{\beta}_{n}^{i}{\beta}_{n}^{j}{\parallel {T}_{i}{x}_{n}-{T}_{j}{x}_{n}\parallel}^{2}\\ & \le & {\alpha}_{n}{\parallel {x}_{n-1}-{x}^{\ast}\parallel}^{2}+\sum _{i=1}^{k}{\beta}_{n}^{i}{\parallel {T}_{i}{x}_{n}-{x}^{\ast}\parallel}^{2}-\sum _{i=1}^{k}{\alpha}_{n}{\beta}_{n}^{i}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2}.\end{array}
(2.7)
Substituting (2.6) in (2.7), we get
{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}\le {\alpha}_{n}{\parallel {x}_{n-1}-{x}^{\ast}\parallel}^{2}+\sum _{i=1}^{k}{\beta}_{n}^{i}{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}+\sum _{i=1}^{k}{\beta}_{n}^{i}{\parallel {x}_{n}-{T}_{i}{x}_{n}\parallel}^{2}-\sum _{i=1}^{k}{\alpha}_{n}{\beta}_{n}^{i}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2}.
(2.8)
Also, we have
\begin{array}{rcl}{\parallel {x}_{n}-{T}_{i}{x}_{n}\parallel}^{2}& =& {\parallel {\alpha}_{n}{x}_{n-1}+\sum _{i=1}^{k}{\beta}_{n}^{i}{T}_{i}{x}_{n}-{T}_{i}{x}_{n}\parallel}^{2}\\ & =& {\alpha}_{n}^{2}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2}.\end{array}
(2.9)
Substituting (2.9) in (2.8), we get
{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}\le {\alpha}_{n}{\parallel {x}_{n-1}-{x}^{\ast}\parallel}^{2}+\sum _{i=1}^{k}{\beta}_{n}^{i}{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}-\sum _{i=1}^{k}{\alpha}_{n}(1-{\alpha}_{n}){\beta}_{n}^{i}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2},
which implies that
{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}\le {\parallel {x}_{n-1}-{x}^{\ast}\parallel}^{2}-\sum _{i=1}^{k}(1-{\alpha}_{n}){\beta}_{n}^{i}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2}.
Thus, from the condition \{{\alpha}_{n}\},\{{\beta}_{n}^{i}\}\subset [\delta ,1-\delta ] for some \delta \in (0,1), i=1,2,\dots ,k, we obtain
{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}\le {\parallel {x}_{n-1}-{x}^{\ast}\parallel}^{2}-\delta (1-\delta )\sum _{i=1}^{k}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2}
(2.10)
for all fixed points {x}^{\ast}\in {\bigcap}_{i=1}^{k}F({T}_{i}). Moreover, we have
\delta (1-\delta )\sum _{i=1}^{k}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2}\le {\parallel {x}_{n-1}-{x}^{\ast}\parallel}^{2}-{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2},
and thus, for all i=1,2,\dots ,k,
\begin{array}{rcl}\delta (1-\delta )\sum _{j=1}^{\mathrm{\infty}}{\parallel {x}_{j-1}-{T}_{i}{x}_{j}\parallel}^{2}& \le & \sum _{j=1}^{\mathrm{\infty}}({\parallel {x}_{j-1}-{x}^{\ast}\parallel}^{2}-{\parallel {x}_{j}-{x}^{\ast}\parallel}^{2})\\ & \le & {\parallel {x}_{0}-{x}^{\ast}\parallel}^{2}.\end{array}
Hence, for all i=1,2,\dots ,k, we obtain
\sum _{j=1}^{\mathrm{\infty}}{\parallel {x}_{j1}{T}_{i}{x}_{j}\parallel}^{2}<\mathrm{\infty}
(2.11)
which implies that
\underset{n\to \mathrm{\infty}}{lim}\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel =0
for each i=1,2,\dots ,k. From (2.9), it further follows that
\underset{n\to \mathrm{\infty}}{lim}\parallel {x}_{n}-{T}_{i}{x}_{n}\parallel =0.
By the compactness of K, there is a subsequence \{{x}_{{n}_{j}}\} of \{{x}_{n}\} converging to some {y}^{\ast}\in K; since each {T}_{i} is continuous and \parallel {x}_{n}-{T}_{i}{x}_{n}\parallel \to 0, the limit {y}^{\ast} is a common fixed point, that is, {y}^{\ast}\in {\bigcap}_{i=1}^{k}F({T}_{i}). Since (2.10) holds for all fixed points in {\bigcap}_{i=1}^{k}F({T}_{i}), we have
{\parallel {x}_{n}-{y}^{\ast}\parallel}^{2}\le {\parallel {x}_{n-1}-{y}^{\ast}\parallel}^{2}-\delta (1-\delta )\sum _{i=1}^{k}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2}
and, in view of (2.11) and Lemma 2.1, we conclude that \parallel {x}_{n}-{y}^{\ast}\parallel \to 0 as n\to \mathrm{\infty}, that is, {x}_{n}\to {y}^{\ast} as n\to \mathrm{\infty}. This completes the proof. □
Theorem 2.2 Let H, K, {T}_{i}, i=1,2,\dots ,k, be as in Theorem 2.1 and let {\alpha}_{n},{\beta}_{n}^{i}\in [0,1] be such that {\alpha}_{n}+{\sum}_{i=1}^{k}{\beta}_{n}^{i}=1 and \{{\alpha}_{n}\},\{{\beta}_{n}^{i}\}\subset [\delta ,1-\delta ] for some \delta \in (0,1), i=1,2,\dots ,k.
If {P}_{K}:H\to K is the projection operator of H onto K, then the sequence \{{x}_{n}\} defined iteratively by
{x}_{n}={P}_{K}({\alpha}_{n}{x}_{n-1}+\sum _{i=1}^{k}{\beta}_{n}^{i}{T}_{i}{x}_{n})
for each n\ge 1 converges strongly to a common fixed point in {\bigcap}_{i=1}^{k}F({T}_{i})\ne \mathrm{\varnothing}.
Proof The mapping {P}_{K} is nonexpansive (see [2]). Moreover, K is a Chebyshev subset of H, and so {P}_{K} is a single-valued mapping. Hence, we have the following estimate:
\begin{array}{rcl}{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}& =& {\parallel {P}_{K}({\alpha}_{n}{x}_{n-1}+\sum _{i=1}^{k}{\beta}_{n}^{i}{T}_{i}{x}_{n})-{P}_{K}{x}^{\ast}\parallel}^{2}\\ & \le & {\parallel {\alpha}_{n}{x}_{n-1}+\sum _{i=1}^{k}{\beta}_{n}^{i}{T}_{i}{x}_{n}-{x}^{\ast}\parallel}^{2}\\ & =& {\parallel {\alpha}_{n}({x}_{n-1}-{x}^{\ast})+\sum _{i=1}^{k}{\beta}_{n}^{i}({T}_{i}{x}_{n}-{x}^{\ast})\parallel}^{2}\\ & \le & {\alpha}_{n}{\parallel {x}_{n-1}-{x}^{\ast}\parallel}^{2}+\sum _{i=1}^{k}{\beta}_{n}^{i}{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}\\ & & -\sum _{i=1}^{k}{\alpha}_{n}(1-{\alpha}_{n}){\beta}_{n}^{i}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2},\end{array}
which implies that
{\parallel {x}_{n}-{x}^{\ast}\parallel}^{2}\le {\parallel {x}_{n-1}-{x}^{\ast}\parallel}^{2}-\sum _{i=1}^{k}(1-{\alpha}_{n}){\beta}_{n}^{i}{\parallel {x}_{n-1}-{T}_{i}{x}_{n}\parallel}^{2}.
Since each {T}_{i} maps the compact set K into itself, the set K\cup {\bigcup}_{i=1}^{k}{T}_{i}(K) is compact, and so the sequence \{\parallel {x}_{n}-{T}_{i}{x}_{n}\parallel \} is bounded. The rest of the argument follows exactly as in the proof of Theorem 2.1. This completes the proof. □
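The proofs of Theorems 2.2 and 2.4 rest on the nonexpansiveness of the projection {P}_{K}. The following is a small numerical sketch of this standard fact, taking K to be the closed unit ball of {\mathbb{R}}^{3} (an illustrative choice; the theorems allow any compact convex K).

```python
# Sketch: the metric projection onto a closed convex set is nonexpansive,
# i.e. dist(P_K(u), P_K(v)) <= dist(u, v). Here K is the closed unit
# ball in R^3, for which the projection has a simple closed form.
import math
import random

def proj_ball(v):                   # projection of v onto the unit ball
    n = math.sqrt(sum(c * c for c in v))
    return v if n <= 1 else [c / n for c in v]

def dist(u, v):                     # Euclidean distance
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

random.seed(3)
ok = True
for _ in range(1000):
    u = [random.uniform(-3, 3) for _ in range(3)]
    v = [random.uniform(-3, 3) for _ in range(3)]
    ok = ok and dist(proj_ball(u), proj_ball(v)) <= dist(u, v) + 1e-12
assert ok
```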
Theorem 2.3 Let K be a compact convex subset of a real Hilbert space H and let {T}_{i}:K\to K, i=1,2,\dots ,k, be a family of Lipschitz hemicontractive mappings. Let {\alpha}_{n},{\beta}_{n}^{i}\in [0,1] be such that {\alpha}_{n}+{\sum}_{i=1}^{k}{\beta}_{n}^{i}=1 and \{{\alpha}_{n}\},\{{\beta}_{n}^{i}\}\subset [\delta ,1-\delta ] for some \delta \in (0,1), i=1,2,\dots ,k.
Then, for arbitrary {x}_{0}\in K, the sequence \{{x}_{n}\} defined by (1.9) converges strongly to a common fixed point in {\bigcap}_{i=1}^{k}F({T}_{i})\ne \mathrm{\varnothing}.
Theorem 2.4 Let H, K, {T}_{i}, i=1,2,\dots ,k, be as in Theorem 2.3 and let {\alpha}_{n},{\beta}_{n}^{i}\in [0,1] be such that {\alpha}_{n}+{\sum}_{i=1}^{k}{\beta}_{n}^{i}=1 and \{{\alpha}_{n}\},\{{\beta}_{n}^{i}\}\subset [\delta ,1-\delta ] for some \delta \in (0,1), i=1,2,\dots ,k.
If {P}_{K}:H\to K is the projection operator of H onto K, then the sequence \{{x}_{n}\} defined iteratively by
{x}_{n}={P}_{K}({\alpha}_{n}{x}_{n-1}+\sum _{i=1}^{k}{\beta}_{n}^{i}{T}_{i}{x}_{n})
for each n\ge 1 converges strongly to a common fixed point in {\bigcap}_{i=1}^{k}F({T}_{i})\ne \mathrm{\varnothing}.
Example For k=2, we can choose the following control parameters: {\alpha}_{n}=\frac{1}{4}-\frac{1}{{(n+2)}^{2}}, {\beta}_{n}^{1}=\frac{1}{4} and {\beta}_{n}^{2}=\frac{1}{2}+\frac{1}{{(n+2)}^{2}}.
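With k=2 and these control parameters, the implicit scheme {x}_{n}={\alpha}_{n}{x}_{n-1}+{\beta}_{n}^{1}{T}_{1}{x}_{n}+{\beta}_{n}^{2}{T}_{2}{x}_{n} can be sketched numerically. The two maps below are our own illustrative choices (T1 = sin, which is nonexpansive, and T2(x) = x/2), sharing the fixed point 0 on the real line; compactness of K is ignored for the sake of a simple one-dimensional sketch, so this is an illustration, not an instance of the theorems.

```python
# Sketch of the implicit iteration with the control parameters of the
# example and two illustrative maps sharing the fixed point 0:
# T1 = sin and T2(x) = x/2. Each outer step solves the implicit equation
# x_n = a_n*x_{n-1} + b1_n*sin(x_n) + b2_n*(x_n/2) by an inner
# fixed-point iteration (the right-hand side is a contraction in x_n here,
# with contraction factor b1_n + b2_n/2 < 1).
import math

def step(x_prev, a, b1, b2):
    z = x_prev                      # inner solve of the implicit equation
    for _ in range(200):
        z = a * x_prev + b1 * math.sin(z) + b2 * (z / 2)
    return z

x = 1.0                             # arbitrary starting point x_0
for n in range(1, 200):
    a = 0.25 - 1.0 / (n + 2) ** 2   # alpha_n
    b1 = 0.25                       # beta_n^1
    b2 = 0.5 + 1.0 / (n + 2) ** 2   # beta_n^2
    x = step(x, a, b1, b2)

assert abs(x) < 1e-6                # x_n approaches the common fixed point 0
```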