Class error 'str' object has no attribute


I am getting the following error for my code:

Enter your first 3x3 matrix: 
    matrices.matrix_one("")
  File "C:\Users\ablev\eclipse-workspace\SDEV300Lab4\matrix.py", line 61, in matrix_one
    self.MATRIX1=[]
AttributeError: 'str' object has no attribute 'MATRIX1'

The program is supposed to ask the user to enter two matrices, then compute the sum of the two matrices. I am new to Python and am probably structuring the class wrong. Any help is appreciated. Thanks.

import numpy as np

class matrices:
    
    def __init__(self,MATRIX1,MATRIX2):
        self.MATRIX1 = ''
        self.MATRIX2 = ''

    def matrix_one(self):
        """first matrix"""
        print("Enter your first 3x3 matrix: ")
        self.MATRIX1=[]
        for i in range(3):
            while True:
                row=input().split()
                row=list(map(int,row))
                if len(row) != 3:
                    print("Please enter 3 rows of 3 columms of\
    numbers seperated by a space: ")
                else:
                    break
            self.MATRIX1.append(row)
        print("Your first 3x3 matrix is: ")
        for i in range(3):
            for j in range(3):
                print(self.MATRIX1[i][j],end=" ")
            print()
    
    def matrix_two(self):
        """asking user for input of second matrix"""
        print("Enter your second 3x3 matrix: ")
        self.MATRIX2=[]
        for i in range(3):
            while True:
                row=input().split()
                row=list(map(int,row))
                if len(row) != 3:
                    print("Please enter 3 rows of 3 columms of\
    numbers seperated by a space: ")
                else:
                    break
            self.MATRIX2.append(row)
        print("Your second 3x3 matrix is: ")
        for i in range(3):
            for j in range(3):
                print(self.MATRIX2[i][j],end=" ")
            print()
    
    def add(self):
        '''function to add the two matrices'''
        results = np.add(self.MATRIX1,self.MATRIX2)
        return results

matrices.matrix_one("")
matrices.matrix_two("")

matrices.add("")

CodePudding user response:

By calling matrices.matrix_one("") you are passing an empty string as self to matrix_one.
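In other words, the call binds that string to self, so the first statement of matrix_one is effectively trying to set an attribute on a str:

matrices.matrix_one("")   # self inside the method is now the string ""
# so self.MATRIX1 = [] is the same as "".MATRIX1 = [],
# which raises AttributeError: 'str' object has no attribute 'MATRIX1'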

You probably want to create an instance of your class instead, so that self refers to that instance when you call the methods. Note that __init__ currently requires MATRIX1 and MATRIX2 arguments it never uses; remove them (or give them default values) so that matrices() can be called without arguments:

m = matrices()
m.matrix_one()
m.matrix_two()
m.add()
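
Putting it together, a minimal sketch of the corrected program might look like the following. This is only one way to do it: it assumes numpy is installed and imported, drops the unused MATRIX1/MATRIX2 parameters from __init__, and adds a read_matrix helper (not in the original) so the input loop is not written twice.

import numpy as np

class matrices:
    def __init__(self):
        # Start with empty matrices; they are filled in by the methods below.
        self.MATRIX1 = []
        self.MATRIX2 = []

    def read_matrix(self, prompt):
        """Read a 3x3 matrix from the user, one row of three integers per line."""
        print(prompt)
        matrix = []
        for _ in range(3):
            while True:
                row = list(map(int, input().split()))
                if len(row) == 3:
                    break
                print("Please enter 3 numbers separated by a space: ")
            matrix.append(row)
        return matrix

    def matrix_one(self):
        """first matrix"""
        self.MATRIX1 = self.read_matrix("Enter your first 3x3 matrix: ")

    def matrix_two(self):
        """second matrix"""
        self.MATRIX2 = self.read_matrix("Enter your second 3x3 matrix: ")

    def add(self):
        """element-wise sum of the two matrices"""
        return np.add(self.MATRIX1, self.MATRIX2)

m = matrices()
m.matrix_one()
m.matrix_two()
print(m.add())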