I'm trying to load a model from Hugging Face. I downloaded the h5 model from here: https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/tree/main
from flask import Flask, jsonify, request # import objects from the Flask module
from keras.models import load_model
from transformers import AutoTokenizer, AutoModelForSequenceClassification,TextClassificationPipeline
model = load_model('./tf_model.h5') # trying to load model here
And this error shows up:
File "C:\D\Learning\Flask\flask-pp-rest\main.py", line 11, in <module>
model = load_model('./tf_model.h5') File "C:\Users\ndrez\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\saving\save.py",
line 200, in load_model
return hdf5_format.load_model_from_hdf5(filepath, custom_objects, File
"C:\Users\ndrez\AppData\Local\Programs\Python\Python39\lib\site-packages\keras\saving\hdf5_format.py",
line 176, in load_model_from_hdf5
raise ValueError('No model found in config file.') ValueError: **No model found in config file.**
Does anyone know how to solve this? If you do, please help me out. I will monitor this question and try to implement your solution.
CodePudding user response:
The tf_model.h5 file you downloaded only contains the model weights, not the model architecture, which is why keras.models.load_model raises "No model found in config file." Load the model through transformers instead. To load the model you specified (PyTorch version), this is the code:
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
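With the model loaded that way, a quick sanity check could look like this (the example sentence below is just an illustration):
import torch

# Tokenize an example sentence and run it through the model
inputs = tokenizer("I really enjoyed this movie!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its label (NEGATIVE / POSITIVE)
predicted_class_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_class_id])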
CodePudding user response:
You can load the TensorFlow version of distilbert-base-uncased-finetuned-sst-2-english
with the TFAutoModelForSequenceClassification class:
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
model = TFAutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
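Since your script already imports TextClassificationPipeline, you can also wrap the TensorFlow model in a pipeline. A minimal sketch (the example sentence is made up):
from transformers import TextClassificationPipeline

# The pipeline handles tokenization and label mapping for you
pipe = TextClassificationPipeline(model=model, tokenizer=tokenizer)
print(pipe("This movie was surprisingly good."))  # e.g. [{'label': 'POSITIVE', 'score': ...}]
Note that from_pretrained also accepts a local directory path instead of the model name, so if you have already downloaded the files you can point it at that folder, as long as config.json sits next to tf_model.h5.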