Can I define a method as an attribute?


The title is a bit ambiguous, so here is the explanation:

import torch
from collections import defaultdict

# Project-local modules referenced below (not shown): callbacks, models, global_params.


class Trainer:
    """Object used to facilitate training."""

    def __init__(
        self,
        # params: Namespace,
        params,
        model,
        device=torch.device("cpu"),
        optimizer=None,
        scheduler=None,
        wandb_run=None,
        early_stopping: callbacks.EarlyStopping = None,
    ):
        # Set params
        self.params = params
        self.model = model
        self.device = device

        # self.optimizer = optimizer  # previously passed in from outside
        self.optimizer = self.get_optimizer(
            model=self.model, optimizer_params=global_params.OptimizerParams()
        )
        self.scheduler = scheduler
        self.wandb_run = wandb_run
        self.early_stopping = early_stopping

        # list to contain various train metrics
        # TODO: how to add more metrics? wandb log too. Maybe save to model artifacts?

        self.history = defaultdict(list)

    @staticmethod
    def get_optimizer(
        model: models.CustomNeuralNet,
        optimizer_params: global_params.OptimizerParams(),
    ):
        """Get the optimizer for the model.

        Args:
            model (models.CustomNeuralNet): [description]
            optimizer_params (global_params.OptimizerParams): [description]

        Returns:
            [type]: [description]
        """
        return getattr(torch.optim, optimizer_params.optimizer_name)(
            model.parameters(), **optimizer_params.optimizer_params
        )
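
As an aside, the getattr lookup in get_optimizer resolves the optimizer class by name at runtime. A minimal sketch of that pattern in isolation, using a toy torch.nn.Linear model as a stand-in for the real one:

import torch

model = torch.nn.Linear(4, 2)  # toy stand-in for the real model

# Same pattern as get_optimizer: resolve the class by name, then instantiate.
optimizer = getattr(torch.optim, "AdamW")(model.parameters(), lr=1e-3)
print(type(optimizer).__name__)  # AdamW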

Notice that initially I passed optimizer into the constructor and created it outside this class. I have now moved get_optimizer inside the class itself (for consistency, though I am unsure whether that is OK). So, should I still define self.optimizer = self.get_optimizer(), or just call self.get_optimizer() at the designated places inside the class? The former reads more clearly to me.
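
For concreteness, here is a minimal runnable sketch of the two styles in question, with a toy model and a module-level factory standing in for the real classes (names here are illustrative only):

import torch

model = torch.nn.Linear(4, 2)

def get_optimizer(model):
    # Stand-in for Trainer.get_optimizer.
    return torch.optim.Adam(model.parameters(), lr=1e-3)

# Style 1: create once, store it, and reuse the stored instance.
optimizer = get_optimizer(model)
for _ in range(3):
    optimizer.zero_grad()
    model(torch.randn(8, 4)).sum().backward()
    optimizer.step()  # same optimizer object every iteration

# Style 2: call the factory at each place an optimizer is needed.
for _ in range(3):
    fresh = get_optimizer(model)  # a brand-new optimizer object each time
    fresh.zero_grad()
    model(torch.randn(8, 4)).sum().backward()
    fresh.step()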


Addendum: I now create the optimizer inside the .fit() method, which I will call, say, 5 times to train the model 5 times. In this scenario, even though there will not be any obvious issue since the optimizer is created once per call, would it still be better not to assign it to self.optimizer here?

    def fit(
        self,
        train_loader: torch.utils.data.DataLoader,
        valid_loader: torch.utils.data.DataLoader,
        fold: int = None,
    ):
        """Fit the model on the given loaders.

        Args:
            train_loader (torch.utils.data.DataLoader): Training set loader.
            valid_loader (torch.utils.data.DataLoader): Validation set loader.
            fold (int, optional): Index of the fold being trained. Defaults to None.
        """
        # OPTIMIZER_PARAMS and SCHEDULER_PARAMS are module-level settings.
        self.optimizer = self.get_optimizer(
            model=self.model, optimizer_params=OPTIMIZER_PARAMS
        )
        self.scheduler = self.get_scheduler(
            optimizer=self.optimizer, scheduler_params=SCHEDULER_PARAMS
        )

Answer:

There is a difference between the two: calling get_optimizer instantiates a new torch.optim optimizer every time it is called. In contrast, setting self.optimizer once and accessing the attribute later creates only a single optimizer instance.
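
To make this concrete, here is a small sketch (toy model; the momentum detail is an added illustration, not from the question): each call returns a distinct object with its own state, so re-creating the optimizer discards accumulated state such as momentum buffers.

import torch

model = torch.nn.Linear(4, 2)

def get_optimizer(model):
    # Stand-in factory: builds a fresh optimizer on every call.
    return torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

opt_a = get_optimizer(model)
opt_b = get_optimizer(model)
print(opt_a is opt_b)  # False -- two distinct optimizer instances

model(torch.randn(8, 4)).sum().backward()
opt_a.step()
print(len(opt_a.state))  # 2 -- opt_a now holds momentum buffers (weight, bias)
print(len(opt_b.state))  # 0 -- a fresh instance starts with empty state

Whether that matters depends on whether you want optimizer state to carry over between runs: if each fit() call is meant to train from scratch, re-creating the optimizer once per call is exactly the behavior you want.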
