For some reason I need to use y_train (the target) inside my model (not only in the loss function), but I haven't found a way to get it.
I get my training dataset like this:
train_ds = DataGenerator("train", args).fetch()
<PrefetchDataset shapes: ((2, None), (2, 4000, 22)), types: (tf.float32, tf.float32)>
The second part, (2, 4000, 22), is the target. Then I fit the model:
history = model.fit(train_ds, validation_data=valid_ds, callbacks=callbacks,
                    batch_size=args.batch_size, epochs=args.max_epoch)
I know how to get the target separately outside the model, but I don't know how to access it inside the model. Is it possible?
CodePudding user response:
https://stanford.edu/~shervine/blog/keras-how-to-generate-data-on-the-fly
The DataGenerator class stores X and y as variables and serves them in batches. Internally, __getitem__(self, index) returns the X, y batch for the model; it calls the __data_generation(self, list_IDs_temp) method, which generates the X, y data. A sketch of such a generator follows the usage snippet below.
training_generator = DataGenerator(partition['train'], labels, **params)
validation_generator = DataGenerator(partition['validation'], labels, **params)

model.fit_generator(generator=training_generator,
                    validation_data=validation_generator,
                    use_multiprocessing=True,
                    workers=2)
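For a concrete picture, here is a minimal sketch of such a Sequence-based generator. The dimensions, n_classes, and the 'data/' + ID + '.npy' loading path are placeholders for illustration, not the blog's exact code:

import numpy as np
import tensorflow as tf

class DataGenerator(tf.keras.utils.Sequence):
    def __init__(self, list_IDs, labels, batch_size=32, dim=(4000, 22), n_classes=2):
        self.list_IDs = list_IDs        # sample identifiers
        self.labels = labels            # dict mapping ID -> label
        self.batch_size = batch_size
        self.dim = dim
        self.n_classes = n_classes

    def __len__(self):
        # number of batches per epoch
        return len(self.list_IDs) // self.batch_size

    def __getitem__(self, index):
        # pick the IDs for this batch and build (X, y) for the model
        batch_ids = self.list_IDs[index * self.batch_size:(index + 1) * self.batch_size]
        return self.__data_generation(batch_ids)

    def __data_generation(self, list_IDs_temp):
        X = np.empty((self.batch_size, *self.dim), dtype=np.float32)
        y = np.empty((self.batch_size,), dtype=int)
        for i, ID in enumerate(list_IDs_temp):
            X[i] = np.load('data/' + ID + '.npy')   # placeholder loading logic
            y[i] = self.labels[ID]
        return X, tf.keras.utils.to_categorical(y, num_classes=self.n_classes)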
CodePudding user response:
Hi guys, I just found some ways to solve this.
Use tf.concat() to concatenate y_train onto x_train, send them into the model together, then separate them inside the model. But the dimensions of both have to match. (My input and target don't have the same size, so I didn't try this; a rough sketch is below.)
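A rough sketch of this idea, assuming the input and target share every dimension except the last axis, along which they are concatenated and later split (the shapes and split sizes here are made up for illustration):

import tensorflow as tf

# hypothetical shapes: x (batch, 4000, 16) and y (batch, 4000, 6) only differ in the last axis
x_train = tf.random.normal((8, 4000, 16))
y_train = tf.random.normal((8, 4000, 6))
combined = tf.concat([x_train, y_train], axis=-1)    # (batch, 4000, 22)

inp = tf.keras.Input(shape=(4000, 22))
x_part, y_part = tf.split(inp, [16, 6], axis=-1)     # recover the two pieces inside the model
h = tf.keras.layers.Dense(64, activation='relu')(x_part)
# y_part is now available for whatever the model needs to do with the target
out = tf.keras.layers.Dense(6)(h)

model = tf.keras.Model(inp, out)
model.compile(optimizer='adam', loss='mse')
model.fit(combined, y_train, epochs=1)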
Use two inputs, like this:
model.fit((x_train, y_train), y_train)
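A minimal sketch of the two-input version: the layer sizes and shapes are assumptions, and the key point is that the target enters the model as a second Input, so layers can read it directly:

import tensorflow as tf

x_in = tf.keras.Input(shape=(4000, 22), name='x')
y_in = tf.keras.Input(shape=(4000, 22), name='y_as_input')   # the target, fed in as an input

h = tf.keras.layers.Dense(64, activation='relu')(x_in)
# y_in is available here for whatever the model needs, e.g. concatenation:
h = tf.keras.layers.Concatenate(axis=-1)([h, y_in])
out = tf.keras.layers.Dense(22)(h)

model = tf.keras.Model(inputs=[x_in, y_in], outputs=out)
model.compile(optimizer='adam', loss='mse')

# dummy data just to show the call pattern
x_train = tf.random.normal((8, 4000, 22))
y_train = tf.random.normal((8, 4000, 22))
model.fit((x_train, y_train), y_train, epochs=1)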
Hope this can help somehow :)