Essentially, I want to propagate data through a Keras model without training it first. I have tried both calling predict() and calling the model directly on raw tensors.
The data is a 2D NumPy float64 array with shape (3, 3), filled entirely with zeros.
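For reference, an equivalent array can be created with:
import numpy as np

data = np.zeros((3, 3))   # dtype defaults to float64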
The model itself is outlined below:
from tensorflow import keras

inputs = keras.Input(shape=(3,), batch_size=1)
FFNNlayer1 = keras.layers.Dense(100, activation='relu')(inputs)
FFNNlayer2 = keras.layers.Dense(100, activation='relu')(FFNNlayer1)
numericalOutput = keras.layers.Dense(3, activation='sigmoid')(FFNNlayer2)
categoricalOutput = keras.layers.Dense(9, activation='softmax')(FFNNlayer2)
outputs = keras.layers.concatenate([numericalOutput, categoricalOutput])
hyperparameters = keras.Model(inputs=inputs, outputs=outputs, name="hyperparameters")
hyperparameters.summary()
The model needs two different activation functions in its output layer, which is why I used the Functional API.
I first attempted to use hyperparameters.predict(data[0]), but kept getting the following error:
WARNING:tensorflow:Model was constructed with shape (1, 3) for input KerasTensor(type_spec=TensorSpec(shape=(1, 3), dtype=tf.float32, name='input_15'), name='input_15', description="created by layer 'input_15'"), but it was called on an input with incompatible shape (None,).
Traceback (most recent call last):
File "<ipython-input-144-4c4a629eaefa>", line 1, in <module>
mainNet.hyperparameters.predict([dataset_info[0]])
File "C:\Users\hudso\anaconda3\lib\site-packages\keras\utils\traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "C:\Users\hudso\AppData\Roaming\Python\Python38\site-packages\tensorflow\python\framework\func_graph.py", line 1129, in autograph_handler
raise e.ag_error_metadata.to_exception(e)
ValueError: in user code:
File "C:\Users\hudso\anaconda3\lib\site-packages\keras\engine\training.py", line 1621, in predict_function *
return step_function(self, iterator)
File "C:\Users\hudso\anaconda3\lib\site-packages\keras\engine\training.py", line 1611, in step_function **
outputs = model.distribute_strategy.run(run_step, args=(data,))
File "C:\Users\hudso\anaconda3\lib\site-packages\keras\engine\training.py", line 1604, in run_step **
outputs = model.predict_step(data)
File "C:\Users\hudso\anaconda3\lib\site-packages\keras\engine\training.py", line 1572, in predict_step
return self(x, training=False)
File "C:\Users\hudso\anaconda3\lib\site-packages\keras\utils\traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "C:\Users\hudso\anaconda3\lib\site-packages\keras\engine\input_spec.py", line 227, in assert_input_compatibility
raise ValueError(f'Input {input_index} of layer "{layer_name}" '
ValueError: Exception encountered when calling layer "hyperparameters" (type Functional).
Input 0 of layer "dense_20" is incompatible with the layer: expected min_ndim=2, found ndim=1. Full shape received: (None,)
Call arguments received:
• inputs=('tf.Tensor(shape=(None,), dtype=float32)',)
• training=False
• mask=None
I fiddled around with array dimensions a bit, but the model continued to give the same error. I then tried feeding raw tensors into the model, with the following code:
import tensorflow as tf

tensorflow_dataset_info = tf.data.Dataset.from_tensor_slices([dataset_info[0]]).batch(1)
aaaaa = enumerate(tensorflow_dataset_info)
predictions = mainNet.hyperparameters(aaaaa)
This code continued to give the following error:
Traceback (most recent call last):
File "<ipython-input-143-df51fe8fd203>", line 1, in <module>
hyperparameters = mainNet.hyperparameters(enumerate(tensorflow_dataset_info))
File "C:\Users\hudso\anaconda3\lib\site-packages\keras\utils\traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "C:\Users\hudso\anaconda3\lib\site-packages\keras\engine\input_spec.py", line 196, in assert_input_compatibility
raise TypeError(f'Inputs to a layer should be tensors. Got: {x}')
TypeError: Inputs to a layer should be tensors. Got: <enumerate object at 0x000001F60081EA40>
I've looked online for a while, and I've searched through the tf.data documentation, but I'm still not sure how to fix this. Again, I've tried multiple variations of this code, and I continue to get mostly the same errors.
CodePudding user response:
If data.shape = (3, 3), then when you pass data[0] to model.predict(), you are actually sending a vector of shape (3,), but your model is expecting shape (1, 3), which means 1 example of size 3.
Try slicing your data instead:
model.predict(data[:1])
This way your tensor will have shape (1, 3).
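If you'd rather keep indexing a single row, you can also add the batch dimension back explicitly with np.expand_dims. Here is a minimal, self-contained sketch that rebuilds the model and the zero-filled array from the question (the names model, data and row are just placeholders):
import numpy as np
from tensorflow import keras

# Same architecture as in the question; untrained weights are fine for a forward pass
inputs = keras.Input(shape=(3,), batch_size=1)
x = keras.layers.Dense(100, activation='relu')(inputs)
x = keras.layers.Dense(100, activation='relu')(x)
numerical = keras.layers.Dense(3, activation='sigmoid')(x)
categorical = keras.layers.Dense(9, activation='softmax')(x)
model = keras.Model(inputs, keras.layers.concatenate([numerical, categorical]))

data = np.zeros((3, 3))                      # the (3, 3) array from the question

row = np.expand_dims(data[0], axis=0)        # shape (1, 3): one example with 3 features
print(model.predict(row).shape)              # (1, 12): 3 sigmoid outputs + 9 softmax outputs
print(model(row.astype('float32')).shape)    # calling the model directly on a (1, 3) batch also works
As for the tf.data attempt in the question: predict() accepts a Dataset directly, so passing tensorflow_dataset_info itself (without wrapping it in enumerate) should also work.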