Load estimator from model artifact in s3 bucket aws

Time: 08-08

I have trained a PyTorch model with a SageMaker estimator and saved the artifacts to an S3 bucket, using the code below:

estimator = PyTorch(
    entry_point="train_deploy.py",
    source_dir="code",
    role=role,
    framework_version="1.3.1",
    py_version="py3",
    instance_count=1,  # this script only supports distributed training on GPU instances
    instance_type="ml.m5.12xlarge",
    output_path=output_path,
    hyperparameters={
        "epochs": 1,
        "num_labels": 7,
        "backend": "gloo",
    },
    disable_profiler=False,  # profiler stays enabled; set to True to disable it
)
estimator.fit({"training": inputs_train, "testing": inputs_test})
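For reference, SageMaker writes the trained artifact to `output_path/<training-job-name>/output/model.tar.gz` (also available as `estimator.model_data` after `fit` completes). A minimal sketch of building that URI by hand; the bucket and job name below are made-up placeholders, not values from this post:

```python
# Sketch: build the S3 URI of the trained model artifact.
# SageMaker stores it under <output_path>/<training-job-name>/output/model.tar.gz.

def model_artifact_uri(output_path: str, training_job_name: str) -> str:
    """Return the S3 URI where SageMaker writes model.tar.gz for a job."""
    return f"{output_path.rstrip('/')}/{training_job_name}/output/model.tar.gz"

uri = model_artifact_uri(
    "s3://my-bucket/models",                 # hypothetical output_path
    "pytorch-training-2020-08-08-00-00-00",  # hypothetical job name
)
print(uri)
# s3://my-bucket/models/pytorch-training-2020-08-08-00-00-00/output/model.tar.gz
```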

The model trains without issues. However, I would like to reuse this model later for inference. How do I do that? I am looking for something like the following:

estimator = PyTorch.load(input_path = "<xyz>")

CodePudding user response:

I was able to solve this with the following steps:

import sagemaker
import numpy as np
from sagemaker.pytorch.model import PyTorchModel

# model_data must point to the model.tar.gz artifact in S3, i.e.
# "<output_path>/<training-job-name>/output/model.tar.gz"
# (estimator.model_data gives this URI directly if the estimator is in scope).
model_data = output_path

pytorch_model = PyTorchModel(
    model_data=model_data,
    role=role,
    framework_version="1.3.1",
    py_version="py3",
    source_dir="code",
    entry_point="train_deploy.py",
)

predictor = pytorch_model.deploy(initial_instance_count=1, instance_type="ml.m4.2xlarge")
predictor.serializer = sagemaker.serializers.JSONSerializer()
predictor.deserializer = sagemaker.deserializers.JSONDeserializer()

result = predictor.predict("<text that needs to be predicted>")
print("predicted class:", np.argmax(result, axis=1))
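With that serializer/deserializer pair, the request body goes out as JSON and the response comes back already parsed into a nested list of per-class scores, one row per input; `np.argmax(..., axis=1)` then picks the highest-scoring class per row. A local sketch of that last step, using made-up logits for the 7 labels from the training job above:

```python
import json
import numpy as np

# Fake response body, as the JSONDeserializer would parse it:
# one row of 7 class logits (num_labels=7 in the training job).
response_json = "[[0.1, 2.3, -0.5, 0.0, 1.1, -1.2, 0.4]]"
result = json.loads(response_json)

predicted = np.argmax(result, axis=1)
print("predicted class:", predicted)  # predicted class: [1]
```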