In this paper, the convergence analysis of the conventional conjugate gradient method (CGM) was reviewed, and the convergence analysis of the modified conjugate gradient method was extended to the preconditioned algorithm. Convergence of the algorithm is governed by the condition number of M⁻¹A, where M is the preconditioner. This work confirms that the modified CGM yields the exact result after n iterations, and further shows that the algorithm converges more quickly when eigenvalues are repeated: given infinite floating-point precision, the number of iterations required to compute an exact solution is at most the number of distinct eigenvalues. It was also found that the modified CGM converges faster when the eigenvalues are clustered together than when they are irregularly distributed over a given interval. The effectiveness of a preconditioner is therefore determined by the condition number of M⁻¹A and, in some cases, by the clustering of its eigenvalues. For large-scale applications, CGM should always be used with a preconditioner to improve convergence.

KEYWORDS: Convergence, Conjugate Gradient, eigenvalue, preconditioning.
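To make the role of the preconditioner concrete, the following is a minimal sketch of a preconditioned conjugate gradient iteration in Python with NumPy. It is an illustration under stated assumptions, not the paper's own implementation: the function name `preconditioned_cg` and the use of a Jacobi (diagonal) preconditioner are choices made here for demonstration, since the paper does not fix a particular M. The argument `M_inv` is any callable applying M⁻¹ to a vector; convergence then depends on the condition number of M⁻¹A rather than of A.

```python
import numpy as np

def preconditioned_cg(A, b, M_inv, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A.

    M_inv is a callable returning M^{-1} v, i.e. the action of the
    inverse preconditioner. With M = I this reduces to plain CGM.
    (Sketch for illustration; names are assumptions, not the paper's.)
    """
    n = len(b)
    if max_iter is None:
        max_iter = n            # exact in at most n steps in exact arithmetic
    x = np.zeros(n)
    r = b - A @ x               # residual
    z = M_inv(r)                # preconditioned residual
    p = z.copy()                # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)   # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        beta = rz_new / rz      # conjugacy coefficient
        p = z + beta * p        # new A-conjugate direction
        rz = rz_new
    return x

# Example with a Jacobi preconditioner (M = diag(A)), a common simple choice:
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
d = np.diag(A)
x = preconditioned_cg(A, b, lambda v: v / d)
```

The Jacobi preconditioner costs only a vector division per iteration; for large-scale problems, stronger preconditioners (e.g. incomplete factorizations) trade setup cost for a smaller condition number of M⁻¹A.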