InaccessibleTensorError - When using `tf.keras.layers.Layer` output in loop condition of another layer


When I use the output of one layer (a tf.keras.layers.Layer) in the loop condition of another layer, I get an InaccessibleTensorError:

InaccessibleTensorError: The tensor 'Tensor("looper/while/sub:0", shape=(None, 1), dtype=float32)' 
cannot be accessed here: it is defined in another function or code block. Use return values, 
explicit Python locals or TensorFlow collections to access it. Defined in: 
FuncGraph(name=looper_while_body_483, id=2098967820416); accessed from: 
FuncGraph(name=looper_scratch_graph, id=2098808987904).

Minimal code to reproduce the error:

import tensorflow as tf
import numpy as np

class Looper(tf.keras.layers.Layer):
    # custom layer
    def __init__(self, units, **kwargs):
        super(Looper, self).__init__(**kwargs)
        self.units = units

    def call(self, input):
        output = []
        while input > 0:
            input = input - 0.01
            output.append(input)
        return tf.stack(output, axis=1)

input_label = tf.keras.Input((1, 3))
lstm1 = tf.keras.layers.LSTM(1)
looper = Looper(10)
output = lstm1(input_label)
output = looper(output)

model = tf.keras.Model(input_label, output)
adam = tf.keras.optimizers.Adam(0.01)
model.compile(adam, 'mse')

I wasn't able to find any similar issues or questions in the TensorFlow issue tracker or on SO. Any help or insight is much appreciated :)

CodePudding user response:

I think the problem lies in the Python list you are using in the custom layer. You should use a TensorFlow collection such as tf.TensorArray for your use case:

import tensorflow as tf
import numpy as np

class Looper(tf.keras.layers.Layer):
    # custom layer
    def __init__(self, units, **kwargs):
        super(Looper, self).__init__(**kwargs)
        self.units = units

    def call(self, input):
        # Accumulate results in a TensorArray rather than a Python list:
        # a TensorArray is a loop variable that the generated while loop
        # can carry across iterations.
        output = tf.TensorArray(dtype=tf.float32, size=0, dynamic_size=True)
        while input > 0:
            input = input - 0.01
            output = output.write(output.size(), input)
        return output.stack()

input_label = tf.keras.Input((1, 3))
lstm1 = tf.keras.layers.LSTM(1)
looper = Looper(10)
output = lstm1(input_label)
output = looper(output)

model = tf.keras.Model(input_label, output)
adam = tf.keras.optimizers.Adam(0.01)
model.compile(adam, 'mse')

print(model(tf.random.normal((1, 1, 3))))
tf.Tensor(
[[[ 0.01392288]]

 [[ 0.00392288]]

 [[-0.00607712]]], shape=(3, 1, 1), dtype=float32)
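
For reference, AutoGraph converts the Python `while` in `call` into a `tf.while_loop`, and only values passed around as loop variables (tensors or a `tf.TensorArray`) can cross iteration boundaries; tensors appended to a plain Python list belong to the loop body's FuncGraph, which is what the InaccessibleTensorError complains about. A rough hand-written equivalent of the loop the fixed layer builds could look like this (my own simplified sketch, not the exact graph TensorFlow constructs):

import tensorflow as tf

# Sketch of an explicit tf.while_loop with a TensorArray as a loop
# variable, so each iteration's value stays inside the same graph.
def cond(x, ta):
    return tf.reduce_all(x > 0)      # continue while every element is positive

def body(x, ta):
    x = x - 0.01
    ta = ta.write(ta.size(), x)      # graph-safe append
    return x, ta

x0 = tf.constant([[0.02]], dtype=tf.float32)
ta0 = tf.TensorArray(dtype=tf.float32, size=0, dynamic_size=True)
_, ta_final = tf.while_loop(cond, body, [x0, ta0])
print(ta_final.stack())              # shape (num_steps, 1, 1)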

Depending on what you want to do, you will probably have to reshape the output from the Looper layer.
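
For example, if the rest of the model expects a batch-first tensor, a transpose like the following could be applied to the Looper output (a minimal sketch assuming the (steps, batch, 1) layout shown in the printed output; batch_first is a hypothetical helper, not part of the answer above):

import tensorflow as tf

def batch_first(looper_output):
    # (steps, batch, 1) -> (batch, steps, 1)
    return tf.transpose(looper_output, perm=[1, 0, 2])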
