I am trying to multiply two matrices stored in column-major order and I can't seem to find the right indexing formula! I want to keep the matrices as 1D arrays.
Let's say I have these matrices:
A=
1 3
2 4
and B=
5 2 1
6 3 7
Both matrices are assumed to already be stored in column-major order.
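For reference, the product I am expecting is
A*B=
23 11 22
34 16 30
(for example, the (1,1) entry is 1*5+3*6=23 and the (2,3) entry is 2*1+4*7=30).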
I am trying:
#include <stdio.h>

int main(int argc, const char* argv[]) {
    int rows = 2;
    int cols = 3;
    int A[rows * rows];    /* A is 2x2 */
    int B[rows * cols];    /* B is 2x3 */
    int res[rows * cols];  /* res = A*B is 2x3 */

    /* filling A and B (is this the right order for column major?) */
    A[0] = 1;
    A[1] = 3;
    A[2] = 2;
    A[3] = 4;
    B[0] = 5;
    B[1] = 2;
    B[2] = 1;
    B[3] = 6;
    B[4] = 3;
    B[5] = 7;

    /* the other ordering I also tried:
    A[0] = 1;
    A[1] = 2;
    A[2] = 3;
    A[3] = 4;
    B[0] = 5;
    B[1] = 6;
    B[2] = 2;
    B[3] = 3;
    B[4] = 1;
    B[5] = 7;
    */

    /* multiplication as column major */
    for (int i = 0; i < rows; i++) {
        for (int j = 0; j < cols; j++) {
            res[i + j * rows] = 0;
            for (int k = 0; k < rows; k++) {
                res[i + j * rows] += A[i + k * rows] * B[k + j * cols];
            }
        }
    }

    for (int i = 0; i < rows * cols; i++) {
        printf("\n\nres[%d]=%d\t", i, res[i]);
    }
    return 0;
}
I am not getting the correct results.
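Since the result is 2x3 and should also be stored column major, I would expect res to hold 23, 34, 11, 16, 22, 30 (the product A*B from above, flattened column by column), but that is not what I get.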
Also, I can't understand how to fill in (index) the matrices A and B when they are already stored in column-major order. Should it be:
A[0]=1;
A[1]=3;
...
or
A[0]=1;
A[1]=2;
...
I don't want to transpose the matrices and then multiply in row-major order; I want to handle the data as column major directly. The indices are different when the data is stored column major, so getting them right is what matters for the multiplication.
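To make the question concrete, the indexing convention I believe applies (please correct me if I have this wrong) is: for an m-by-n matrix stored column major, element (i,j) sits at index i + j*m, where m is the number of rows of that particular matrix. As a sketch (IDX is just a name I made up for illustration):

/* column-major indexing as I understand it:
   element (i, j) of an m-by-n matrix -> index i + j*m */
#define IDX(i, j, m) ((i) + (j) * (m))

If that is right, I am not sure what plays the role of m for B inside the multiplication loop (rows or cols), and I suspect that is where my formula goes wrong.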