The code is supposed to read input to build a 3x3 matrix and then multiply each element by the diagonal element of its row, but, for a reason I don't understand, it multiplies by the diagonal twice whenever the column index is greater than the row index.
#include <stdio.h>
#define R 3
int a[R][R], i, j;
int main(void) {
    for (i = 0; i < R; i++) {
        for (j = 0; j < R; j++) {
            printf("\nEnter n%i%i ", i, j);
            scanf("%i", &a[i][j]);
        }
    }
    for (i = 0; i < R; i++) {
        for (j = 0; j < R; j++) {
            a[i][j] = a[i][j] * a[i][i];
        }
    }
    for (i = 0; i < R; i++) {
        printf("\n");
        for (j = 0; j < R; j++) {
            printf("%i ", a[i][j]);
        }
    }
}
input:
9 8 7
6 5 4
3 2 1
output:
81 648 567
30 25 100
3 2 1
CodePudding user response:
The diagonal value for a given row is overwritten before the rest of that row has been multiplied, so once the column index passes the diagonal, the multiplications use the new value of the diagonal rather than the original one.
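You can see this concretely by tracing row 0 of the sample input, which starts as 9 8 7 with diagonal a[0][0] = 9:

j = 0: a[0][0] = 9 * 9  = 81   (the diagonal itself is overwritten here)
j = 1: a[0][1] = 8 * 81 = 648  (uses the new diagonal 81, not the original 9)
j = 2: a[0][2] = 7 * 81 = 567

Every element at or before the diagonal is multiplied by the original value; every element after it is multiplied by the already-squared value, which is exactly the "multiplies two times" effect in the output.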
You can fix it (and improve the speed) as follows:
for (i = 0; i < R; i++) {
    int tmp = a[i][i];  /* save the original diagonal before the inner loop overwrites it */
    for (j = 0; j < R; j++) {
        a[i][j] *= tmp;
    }
}
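Caching the diagonal in tmp is also where the speed improvement comes from: in the original loop the compiler has to re-read a[i][i] on every iteration, because the store to a[i][j] can alias it (it does when j == i), so the load cannot safely be hoisted out of the loop.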
Also, both i and j should be local variables inside main rather than globals.
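For reference, a complete version with both changes applied might look like this (a sketch of the cleanup, not code from the question):

#include <stdio.h>
#define R 3

int main(void) {
    int a[R][R];

    /* Read the matrix. */
    for (int i = 0; i < R; i++) {
        for (int j = 0; j < R; j++) {
            printf("\nEnter n%i%i ", i, j);
            scanf("%i", &a[i][j]);
        }
    }

    /* Multiply each row by its original diagonal element. */
    for (int i = 0; i < R; i++) {
        int tmp = a[i][i];  /* saved before the inner loop can overwrite it */
        for (int j = 0; j < R; j++) {
            a[i][j] *= tmp;
        }
    }

    /* Print the result. */
    for (int i = 0; i < R; i++) {
        printf("\n");
        for (int j = 0; j < R; j++) {
            printf("%i ", a[i][j]);
        }
    }
    printf("\n");
    return 0;
}

With the sample input above this prints the intended result: 81 72 63 / 30 25 20 / 3 2 1.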