
I wrote a naive Gaussian elimination without pivoting:

function [x] = NaiveGaussianElimination(A, b)
    % Solve A*x = b by naive Gaussian elimination (no pivoting),
    % counting multiplications/divisions and additions/subtractions.
    N = length(b);
    x = zeros(N,1);
    mulDivOp = 0;
    subAddOp = 0;
    % Forward elimination: zero out the entries below the diagonal.
    for column = 1:(N-1)
        for row = (column+1):N
            mul = A(row,column)/A(column,column);   % row multiplier
            A(row,:) = A(row,:) - mul*A(column,:);
            b(row) = b(row) - mul*b(column);
            mulDivOp = mulDivOp + N - column + 2;   % 1 div + (N-column+1) mults
            subAddOp = subAddOp + N - column + 1;
        end
    end
    % Back substitution on the upper-triangular system.
    for row = N:-1:1
        x(row) = b(row);
        for i = (row+1):N
            x(row) = x(row) - A(row,i)*x(i);
        end
        x(row) = x(row)/A(row,row);
        mulDivOp = mulDivOp + N - row + 1;
        subAddOp = subAddOp + N - row;
    end
    x = x';       % return x as a row vector
    mulDivOp      % no semicolon: display the operation counts
    subAddOp
end

but I am curious whether I can reduce the number of multiplications/divisions and additions/subtractions when I know in advance which elements of the matrix are 0:

For N = 10:

A =

    96   118     0     0     0     0     0     0     0    63
   154   -31  -258     0     0     0     0     0     0     0
     0  -168   257  -216     0     0     0     0     0     0
     0     0   202    24   308     0     0     0     0     0
     0     0     0  -262   -36  -244     0     0     0     0
     0     0     0     0   287  -308   171     0     0     0
     0     0     0     0     0   197   229  -258     0     0
     0     0     0     0     0     0   -62  -149   186     0
     0     0     0     0     0     0     0   -43   255  -198
  -147     0     0     0     0     0     0     0  -147  -220

(the non-zero values come from randi). In general, the non-zero elements are a_{1,N}, a_{N,1}, and a_{i,j} with abs(i-j) <= 1.
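One simple way to exploit known zeros is to skip a row entirely when the entry below the pivot is already zero (no multiplier, no row update) and to skip pivot-row entries that are zero. The following sketch shows the idea in Python for illustration only (the function name and counting scheme are mine, not from the thread); the same loop-bound changes carry over to the MATLAB routine above.

```python
def gauss_solve_skip_zeros(A, b):
    """Solve A x = b by Gaussian elimination without pivoting,
    skipping work wherever a known zero makes the update a no-op.
    Returns (x, mul_div_ops, add_sub_ops)."""
    n = len(b)
    A = [row[:] for row in A]   # work on copies
    b = b[:]
    mul_div = 0
    add_sub = 0
    # Forward elimination.
    for col in range(n - 1):
        for row in range(col + 1, n):
            if A[row][col] == 0:
                continue                      # nothing to eliminate: 0 ops
            m = A[row][col] / A[col][col]
            mul_div += 1
            for j in range(col + 1, n):
                if A[col][j] != 0:            # skip zero pivot-row entries
                    A[row][j] -= m * A[col][j]
                    mul_div += 1
                    add_sub += 1
            A[row][col] = 0.0                 # exact zero below the pivot
            b[row] -= m * b[col]
            mul_div += 1
            add_sub += 1
    # Back substitution.
    x = [0.0] * n
    for row in range(n - 1, -1, -1):
        s = b[row]
        for j in range(row + 1, n):
            s -= A[row][j] * x[j]
            mul_div += 1
            add_sub += 1
        x[row] = s / A[row][row]
        mul_div += 1
    return x, mul_div, add_sub
```

How much this saves depends on how much fill-in the elimination itself creates as it proceeds, which for this pattern is the crux of the question.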

Lempek
  • maybe this helps? http://stackoverflow.com/questions/9028701/gaussian-elimination-in-matlab – lhcgeneva Dec 01 '14 at 15:06
  • @lhcgeneva Thanks for the link; I've read that question and its answers, but my task is to implement the naive Gaussian elimination algorithm myself and then figure out how to reduce the number of operations in this special case. I've implemented the elimination, and now I'm trying to solve the second part. – Lempek Dec 01 '14 at 16:53

1 Answer


Probably not. There are nice algorithms for reducing tridiagonal matrices (which these are not, though they are close) to diagonal matrices. Indeed, this is one way in which the SVD of a matrix is produced, using orthogonal similarity transformations rather than Gaussian elimination.
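For context on what is possible for a strictly tridiagonal matrix (no corner entries): such a *system* can be solved in O(N) operations by the Thomas algorithm, which is a different tool than the orthogonal reductions mentioned above. A sketch in Python for illustration (function name and conventions are mine):

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system in O(n) by the Thomas algorithm.
    a: sub-diagonal  (length n, a[0] unused)
    b: main diagonal (length n)
    c: super-diagonal(length n, c[-1] unused)
    d: right-hand side (length n)"""
    n = len(b)
    cp = [0.0] * n   # modified super-diagonal
    dp = [0.0] * n   # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward sweep: eliminate the sub-diagonal.
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    # Back substitution.
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

The corner entries a_{1,N} and a_{N,1} in the question's matrix break this three-band assumption, which is exactly why the plain tridiagonal machinery does not apply directly.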

The problem is that when you use Gaussian elimination to remove the nonzero entries in the first column, you will have introduced additional nonzero entries in the other columns. The further you proceed, the more you destroy the structure of the matrix. It may be that Gaussian elimination is simply the wrong approach for the problem you are trying to solve, at least if you are trying to exploit the structure of the matrix.
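The fill-in claim can be checked structurally without doing any arithmetic: run the elimination on the zero/nonzero *pattern* alone and record every position that turns nonzero. A small Python sketch (illustrative; the function is mine, not from the answer):

```python
def fill_in_pattern(n):
    """Structural Gaussian elimination on the question's sparsity pattern:
    tridiagonal plus the two corner entries (1,N) and (N,1).
    Returns the sorted list of (row, col) positions where fill-in appears."""
    nz = [[abs(i - j) <= 1 or (i, j) in ((0, n - 1), (n - 1, 0))
           for j in range(n)] for i in range(n)]
    fills = set()
    for col in range(n - 1):
        for row in range(col + 1, n):
            if nz[row][col]:
                # The row update can turn on any position where the
                # pivot row is nonzero.
                for j in range(col + 1, n):
                    if nz[col][j] and not nz[row][j]:
                        nz[row][j] = True
                        fills.add((row, j))
                nz[row][col] = False   # eliminated
    return sorted(fills)
```

For this particular corner-plus-tridiagonal pattern the fill-in is confined to the last column and the last row (roughly 2(N-3) new nonzeros), which illustrates the answer's point: eliminating the first column immediately creates entries that were not there before.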

Jeremy West