A Preconditioned ELMRES Implementation

Mehdi Delkhosh, Hossein Zareamoghaddam

Department of Mathematics, Islamic Azad University, Bardaskan Branch, Bardaskan, Iran

Abstract

In this paper we review the ELMentary RESidual (ELMRES) algorithm for solving linear systems of equations. ELMRES is a Krylov subspace method which uses the Hessenberg transformation as the projection technique for reducing the dimension of the original matrix A. We apply some preconditioning techniques to this algorithm. At the end of the paper, some numerical examples are given to compare the preconditioned ELMRES with the original version.

Cite this article: Delkhosh, M., & Zareamoghaddam, H. (2013). A Preconditioned ELMRES Implementation. Applied Mathematics and Physics, 1(4), 90-93.

1. Introduction

Many applications require the solution of sparse systems of linear equations

$Ax = b, \qquad (1)$

with the coefficient matrix $A \in \mathbb{R}^{n \times n}$, the right-hand-side vector $b \in \mathbb{R}^n$ and the unknown solution vector $x \in \mathbb{R}^n$.

There are several different iterative algorithms for computing the solution of (1). Among them, Krylov subspace methods, such as the algorithms of [1, 3, 5, 6], are popular solvers for this sort of equation. The Krylov subspace is defined as

$K_m(A, r_0) = \operatorname{span}\{r_0, A r_0, A^2 r_0, \ldots, A^{m-1} r_0\},$

which is a practical subspace for building new approximations $x_m \in x_0 + K_m(A, r_0)$, where $x_0$ is the initial guess and $r_0 = b - A x_0$ is the initial residual. ELMRES was proposed by G. Howell and D. Stephens [3] in 2000. The implementation of ELMRES is broadly similar to that of GMRES: in both methods the original problem (1) is transformed into an upper Hessenberg least-squares problem of dimension $m \ll n$, but GMRES uses the Arnoldi orthogonalization algorithm for the reduction whereas ELMRES uses the Hessenberg decomposition algorithm. The structure of this paper is as follows. In Section 2, ELMRES is discussed. In Section 3, the left and right preconditioning methods are described briefly and some popular preconditioning matrices, which are also practical for ELMRES, are shown. In the last section, the proposed preconditioning matrices are applied and tested on some numerical examples.

2. ELMRES

Here the ELMRES algorithm is described briefly. In the ELMRES method, a basis for the Krylov subspace $K_m(A, r_0)$ is constructed by the Hessenberg algorithm. Suppose $l_1 = r_0/\beta$ for a suitable scalar $\beta \neq 0$. Now, the procedure for extending the dimension of the subspace is shown in the following algorithm [3].

2.1. Algorithm 1: Hessenberg Decomposition Algorithm

1. Choose a vector $v$ and set $l_1 = v/(e_1^T v)$ and $L_1 = [\,l_1\,]$.

2. For $j = 1, 2, \ldots, m$

1. $u = A\, l_j$; for $i = 1, \ldots, j$: $h_{i,j} = e_i^T u$, $u = u - h_{i,j}\, l_i$,

2. $h_{j+1,j} = e_{j+1}^T u$, $l_{j+1} = u/h_{j+1,j}$, $L_{j+1} = [\,L_j,\ l_{j+1}\,]$,

End

3. Set $\bar H_m = (h_{i,j}) \in \mathbb{R}^{(m+1) \times m}$, the upper Hessenberg matrix of the coefficients computed above.
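For readers who want to experiment with the process, the following is a minimal NumPy sketch of Algorithm 1 (without pivoting). The function name hessenberg_basis, the dense-matrix setting and the random test are our own illustrative choices, not part of [3]; the final print verifies numerically the basis relation $A L_m = L_{m+1} \bar H_m$ derived later in this section.

```python
import numpy as np

def hessenberg_basis(A, v, m):
    """Sketch of Algorithm 1 (no pivoting): build l_1, ..., l_{m+1} and the
    (m+1) x m upper Hessenberg matrix Hbar such that A @ L[:, :m] = L @ Hbar."""
    n = A.shape[0]
    L = np.zeros((n, m + 1))
    Hbar = np.zeros((m + 1, m))
    L[:, 0] = v / v[0]                    # l_1 scaled so that its first entry is 1
    for j in range(m):
        u = A @ L[:, j]
        for i in range(j + 1):            # the i-th step zeros the i-th entry of u
            Hbar[i, j] = u[i]
            u -= Hbar[i, j] * L[:, i]
        Hbar[j + 1, j] = u[j + 1]         # the process breaks down if this is zero
        L[:, j + 1] = u / Hbar[j + 1, j]
    return L, Hbar

# quick check of the relation A L_m = L_{m+1} Hbar_m on a random example
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
r0 = rng.standard_normal(50)
L, Hbar = hessenberg_basis(A, r0, m=10)
print(np.linalg.norm(A @ L[:, :10] - L @ Hbar))   # ~ machine precision
```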

In Algorithm 1, it is shown that the $i$th step of the inner for loop zeros the $i$th entry of $u$, where $i = 1, \ldots, j$. At the end of the algorithm, the $(j+1)$st entry of $l_{j+1}$ becomes $1$. In this case, the first $j$ entries of $l_{j+1}$ are zero. If partial pivoting is employed, the relation

$\|l_j\|_\infty = 1, \qquad j = 1, 2, \ldots, \qquad (2)$

is obtained (see [3]).

An elementary similarity transformation is of the form

$N_j = I + n_j\, e_j^T, \qquad (3)$

where $n_j$ is a column vector with zeros as its first $j$ entries and $e_j$ is a vector whose only nonzero entry is a one in the $j$th position.

Without pivoting, reduction of $A$ to upper Hessenberg form by elementary similarity transformations is accomplished by

$N^{-1} A N = H, \qquad N = N_1 N_2 \cdots N_{n-2}, \qquad (4)$

and from the structure of the $N_j$'s the matrix $N$ is unit lower triangular. Writing $L_m = [\,l_1, \ldots, l_m\,]$ for the matrix of the basis vectors generated by Algorithm 1 and $\bar H_m$ for the $(m+1) \times m$ upper Hessenberg matrix of the coefficients $h_{i,j}$, the following relation is obtained

$A L_m = L_{m+1} \bar H_m. \qquad (5)$

By using this important relation, we reach the reduced problem

$\min_{y \in \mathbb{R}^m} \|\beta e_1 - \bar H_m y\|_2.$

The approximate solution is $x_m = x_0 + L_m y_m$, where $y_m$ is the solution of the above least-squares problem.
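For clarity, the short derivation behind this reduced problem is the standard one, using $r_0 = \beta l_1$ and relation (5):

$r_m = b - A x_m = r_0 - A L_m y_m = \beta l_1 - L_{m+1} \bar H_m y_m = L_{m+1}\bigl(\beta e_1 - \bar H_m y_m\bigr).$

Since the columns of $L_{m+1}$ are not orthonormal, minimizing $\|\beta e_1 - \bar H_m y\|_2$ minimizes a quasi-residual rather than $\|r_m\|_2$ itself; this is the main algorithmic difference from GMRES.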

Wilkinson [8] suggested that Hessenberg reduction with implicit (partial) pivoting avoids breakdown in the decomposition process. Here, ELMRES with implicit pivoting [3] is shown.

2.2. Algorithm 2: ELMRES with Implicit Pivoting

1. Choose $x_0$, set $r_0 = b - A x_0$ and $p = (1, 2, \ldots, n)^T$,

set $i_0$ so that $|(r_0)_{i_0}| = \|r_0\|_\infty$; put $\beta = (r_0)_{i_0}$, $l_1 = r_0/\beta$ and interchange $p_1$ and $p_{i_0}$.

2. For $j = 1, 2, \ldots$, until satisfied do

$u = A\, l_j$,

For $i = 1, \ldots, j$,

$h_{i,j} = u_{p_i}$, $u = u - h_{i,j}\, l_i$,

End

$i_0 = j + 1$,

For $i = j+2, \ldots, n$ do

if $|u_{p_i}| > |u_{p_{i_0}}|$,

set $i_0 = i$,

End

interchange $p_{j+1}$ and $p_{i_0}$ so that $|u_{p_{j+1}}| = \max_{j < i \le n} |u_{p_i}|$.

$h_{j+1,j} = u_{p_{j+1}}$,

$l_{j+1} = u / h_{j+1,j}$.

End

Choose $y_m$ to minimize $\|\beta e_1 - \bar H_m y\|_2$.

3. Form the approximate solution $x_m = x_0 + L_m y_m$, i.e. set $x_m = x_0$ and

For $i = 1, \ldots, m$,

$x_m = x_m + (y_m)_i\, l_i$,

End

For more information about ELMRES and its properties see [3].
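As a concrete illustration of Algorithm 2, the following is a compact NumPy sketch of ELMRES with implicit pivoting. It is a simplified reading of the steps above (dense matrices, a fixed number of steps instead of a convergence test, and the hypothetical helper name elmres); it is not the authors' Matlab implementation.

```python
import numpy as np

def elmres(A, b, x0=None, m=50):
    """Sketch of ELMRES with implicit pivoting (assumes m < n): build a pivoted
    Hessenberg basis, solve min ||beta*e1 - Hbar*y||, and set x = x0 + L*y.
    A practical code would also monitor the residual and stop early."""
    n = A.shape[0]
    x0 = np.zeros(n) if x0 is None else x0
    r0 = b - A @ x0

    p = np.arange(n)                          # pivot (permutation) vector
    i0 = int(np.argmax(np.abs(r0)))           # entry of r0 with largest magnitude
    beta = r0[i0]
    p[[0, i0]] = p[[i0, 0]]

    L = np.zeros((n, m + 1))                  # Hessenberg basis vectors l_1 .. l_{m+1}
    Hbar = np.zeros((m + 1, m))               # (m+1) x m upper Hessenberg matrix
    L[:, 0] = r0 / beta

    k = m
    for j in range(m):
        u = A @ L[:, j]
        for i in range(j + 1):                # eliminate the already-pivoted entries
            Hbar[i, j] = u[p[i]]
            u -= Hbar[i, j] * L[:, i]
        i0 = j + 1 + int(np.argmax(np.abs(u[p[j + 1:]])))   # pivot search
        p[[j + 1, i0]] = p[[i0, j + 1]]
        Hbar[j + 1, j] = u[p[j + 1]]
        if Hbar[j + 1, j] == 0.0:             # happy breakdown: subspace exhausted
            k = j + 1
            break
        L[:, j + 1] = u / Hbar[j + 1, j]

    rhs = np.zeros(k + 1)
    rhs[0] = beta
    y, *_ = np.linalg.lstsq(Hbar[:k + 1, :k], rhs, rcond=None)
    return x0 + L[:, :k] @ y
```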

3. Preconditioned ELMRES

The structure of GMRES is broadly similar to that of ELMRES, so preconditioning techniques developed for GMRES may also be useful for ELMRES. Here, GMRES with left and right preconditioning is recalled and some suitable matrices for preconditioned ELMRES are suggested.

Let $M$ be a preconditioning matrix. In left-preconditioned GMRES, the original equation is replaced with $M^{-1} A x = M^{-1} b$ and the orthogonalization procedure is run so that an orthonormal basis is obtained for the preconditioned Krylov subspace

$K_m(M^{-1}A, z_0) = \operatorname{span}\{z_0, M^{-1}A z_0, \ldots, (M^{-1}A)^{m-1} z_0\}, \qquad z_0 = M^{-1}(b - A x_0),$

by the Arnoldi algorithm. So, the GMRES algorithm simply has to be applied to the new linear system of equations.

Furthermore, the right-preconditioned GMRES algorithm is based on solving

$A M^{-1} u = b, \qquad u = M x. \qquad (6)$

As the right-preconditioned variant is the more practical one, its algorithm is written below [6].

3.1. Algorithm 3: Right Preconditioned GMRES

1. Choose $x_0$, set $r_0 = b - A x_0$, $\beta = \|r_0\|_2$, $v_1 = r_0/\beta$.

2. For $j = 1, 2, \ldots, m$

$w = A M^{-1} v_j$,

For $i = 1, \ldots, j$

do $h_{i,j} = (w, v_i)$, $w = w - h_{i,j}\, v_i$,

End

$h_{j+1,j} = \|w\|_2$, $v_{j+1} = w/h_{j+1,j}$,

If $h_{j+1,j} = 0$ set $m = j$ and goto 3,

End

3. $y_m = \arg\min_y \|\beta e_1 - \bar H_m y\|_2$ and $x_m = x_0 + M^{-1} V_m y_m$.

In the above algorithm, the preconditioned unknown $u = M x$ is never computed explicitly, and the Arnoldi process generates an orthonormal basis $v_1, \ldots, v_m$ for the preconditioned subspace

$\operatorname{span}\{r_0, A M^{-1} r_0, \ldots, (A M^{-1})^{m-1} r_0\}.$

In this case, the residual norm that is minimized is relative to the initial system (1), since $b - A x_m = b - A M^{-1} u_m$.
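As a minimal sketch of how right preconditioning is plugged into a Krylov solver, the following uses the illustrative elmres routine from Section 2 and forms $A M^{-1}$ explicitly only for brevity; a practical code would instead apply $M^{-1}$ by a (triangular) solve at every step.

```python
import numpy as np

def right_preconditioned_elmres(A, b, M, m=50):
    """Solve A x = b by applying the elmres sketch to A M^{-1}, cf. eq. (6),
    and recovering x = M^{-1} u.  Dense and purely illustrative."""
    Minv = np.linalg.inv(M)        # illustration only; never invert M in practice
    u = elmres(A @ Minv, b, m=m)   # Krylov solve on the right-preconditioned operator
    return Minv @ u                # x = M^{-1} u
```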

Gauss-Seidel and SOR are two popular classical iterative methods for solving (1), and using the matrices associated with these splittings as preconditioners is a popular technique for GMRES. Here we apply the preconditioning matrices $M_{GS}$ and $M_{SOR}$ obtained from Gauss-Seidel and SOR, respectively. Suppose $A = D - E - F$, where $-E$, $D$ and $-F$ are the strictly lower triangular part, the diagonal and the strictly upper triangular part of the matrix $A$, respectively. Then $M_{GS}$ and $M_{SOR}$ are defined as follows [6]:

$M_{GS} = D - E, \qquad M_{SOR} = \frac{1}{\omega}\,(D - \omega E). \qquad (7)$
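The following is a small SciPy sketch of how the preconditioning matrices of (7) can be assembled; the function names are our own. Since both matrices are lower triangular, applying $M^{-1}$ to a vector costs one sparse triangular solve.

```python
import numpy as np
from scipy.sparse import csr_matrix, diags, tril
from scipy.sparse.linalg import spsolve_triangular

def gauss_seidel_preconditioner(A):
    """M_GS = D - E: the lower triangular part of A, diagonal included."""
    return csr_matrix(tril(A))

def sor_preconditioner(A, omega):
    """M_SOR = (1/omega) (D - omega E), with D = diag(A) and -E the strictly
    lower triangular part of A."""
    D = diags(csr_matrix(A).diagonal())
    E = -tril(A, k=-1)
    return csr_matrix((D - omega * E) / omega)

# Example: applying M_GS^{-1} to a vector via a sparse triangular solve.
A = diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(100, 100), format="csr")
v = np.ones(100)
z = spsolve_triangular(gauss_seidel_preconditioner(A), v, lower=True)  # z = M_GS^{-1} v
```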

4. Numerical Tests

In this section, some well-known numerical examples are used to test the effect of the preconditioning techniques on the convergence speed of ELMRES. For simplicity, $x_0 = 0$ is chosen as the initial guess in these experiments. The algorithms are implemented in Matlab 7.3 on a Pentium 4 PC with 1 GB of RAM. The same stopping criterion on the relative residual norm is used for all implementations.

Example 1: For the construction of the linear system of equations (1), let $n$ be an even integer and denote by $I$ and $O$, respectively, the identity and zero matrices of order $n/2$. Define also the matrices $B$ and $C$, and finally construct the non-symmetric matrix $A$ from these blocks, as described in [7].

The matrix $A$ is obtained by first discretizing the Poisson equation

$-\Delta u = f$ for $(x, y)$ in the unit square,

with Neumann boundary conditions

$\partial u/\partial n = g$ on the boundary,

on a uniform grid of mesh size $h$ via central differences, and then by taking the unknowns in the red-black ordering. This problem was also considered by Saberi Najafi and Zareamoghaddam in [7] for testing another GMRES implementation. Note that $A$ is singular with a one-dimensional null space spanned by the vector $(1, 1, \ldots, 1)^T$.

In our numerical experiments, for the first example we took a particular even value of $n$ (the corresponding square matrix $A$ is then of order $n$). Table 1 shows the numerical results of the different ELMRES implementations applied to this linear problem.

Table 1. Numerical results of Example 1

Example 2: There are many numerical test problems for systems of linear equations; among them, the "Regularization Tools" package of Prof. Hansen [2] is a popular collection of ill-posed linear systems of equations. In this example, the source code of one discrete image deblurring problem is selected from this package [2]. The Matlab code "blur", with suitable parameter values, is used to generate the square matrix $A$ together with the right-hand-side vector $b$. This problem is contaminated by blur and noise. Numerical results of this test are given in Table 2.

5. Conclusion

The elementary residual method (ELMRES) is a rather new Krylov subspace method for solving linear systems of equations. In this paper, some right preconditioning matrices have been tested with ELMRES for solving different linear systems of equations, and the results show that the convergence of ELMRES is also improved by appropriate preconditioning matrices.

Acknowledgement

The authors would like to thank Islamic Azad University, Bardaskan Branch, for its financial support.

References

[1] Freund, R., Nachtigal, N., QMR: a quasi-minimal residual method for non-Hermitian linear systems, Numer. Math. 60 (1991) 315-339.

[2] Hansen, P. C., Regularization Tools version 4.0 for Matlab 7.3, Numerical Algorithms 46 (2007) 189-194.

[3] Howell, G., Stephens, D., ELMRES: an oblique projection solver for sparse systems of linear equations, Technical Report, Florida Institute of Technology, 2000.

[4] Lewis, B., Reichel, L., Arnoldi-Tikhonov regularization methods, J. Comput. Appl. Math. 226 (2009) 92-102.

[5] Paige, C. C., Saunders, M. A., LSQR: an algorithm for sparse linear equations and sparse least squares, ACM Trans. Math. Software 8 (1982) 43-71.

[6] Saad, Y., Iterative Methods for Sparse Linear Systems, PWS Publishing Company, a division of International Thomson Publishing Inc., USA, 1996.

[7] Saberi Najafi, H., Zareamoghaddam, H., A new computational GMRES method, Appl. Math. Comput. 2 (2008) 527-534.

[8] Wilkinson, J. H., The Algebraic Eigenvalue Problem, Clarendon Press, Oxford, 1965.
 