Why use lambda layers instead of just plain code?


For example:

inp = keras.Input(shape=s)
D = keras.layers.Dense(d)(inp)
L = keras.layers.Lambda(lambda x: myFunc(x))(D)
outp = keras.layers.Dense(...)(L)
model = keras.Model(inp, outp)

vs.

inp = keras.Input(shape=s)
D = keras.layers.Dense(d)(inp)
L = myFunc(D)
outp = keras.layers.Dense(...)(L)
model = keras.Model(inp, outp)

where myFunc is made up of tensorflow functions.

The latter version seems to work just fine. Is there any special value in wrapping stuff as layers? I understand it might be required when there are learnable parameters, but I would use a custom layer in such a case, not a lambda.

CodePudding user response:

I don't think there are many functional incentives for using tf.keras.layers.Lambda, but there are a few I can think of.

Readability

I like to use them mainly for consistency and readability. I find it a bit messy when Keras layers are mixed with raw TF operations.

For example, I can easily do,

tf.keras.layers.Lambda(lambda x: do_cool(x), name="cool_thing")

which will show up under that name in model.summary().
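A minimal runnable sketch of this (layer sizes and the ReLU function are made up for illustration):

```python
import tensorflow as tf
from tensorflow import keras

inp = keras.Input(shape=(4,))
x = keras.layers.Dense(8)(inp)
# Naming the Lambda makes it easy to spot in model.summary()
x = keras.layers.Lambda(lambda t: tf.nn.relu(t), name="cool_thing")(x)
out = keras.layers.Dense(1)(x)
model = keras.Model(inp, out)

layer_names = [layer.name for layer in model.layers]
print(layer_names)  # "cool_thing" appears among the layer names
```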

Masking

With a Lambda layer you can pass in the mask argument to handle masking, whereas with raw TF ops your masking code easily becomes messy.
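For example, a Lambda layer can forward an incoming mask explicitly via its mask argument, which takes a callable with the compute_mask signature (the Embedding sizes here are arbitrary):

```python
import tensorflow as tf
from tensorflow import keras

inp = keras.Input(shape=(None,), dtype="int32")
# mask_zero=True makes the Embedding emit a mask for padded positions
emb = keras.layers.Embedding(input_dim=100, output_dim=8, mask_zero=True)(inp)
# The mask callable follows compute_mask: (inputs, mask) -> mask
scaled = keras.layers.Lambda(
    lambda t: t * 2.0,
    mask=lambda inputs, mask: mask,  # propagate the incoming mask unchanged
)(emb)
model = keras.Model(inp, scaled)
```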

Important notes

  • It's not advisable to use a Lambda layer for operations involving tf.Variable objects. Such variables won't be tracked as trainable variables of your model.
  • Lambda ops are stateless. So for any stateful operations, Lambda is a no go.
  • I'm not 100% sure about this one, but I assume TF does some implicit wrapping of TF operations as Lambda layers anyway when you use them in a model (going with the TFOpLambda I see when printing model.summary()).
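To illustrate the first point, here is a small sketch (the variable and shapes are made up): a tf.Variable closed over by a Lambda is used in the forward pass, but the model never collects it into its trainable weights, so it would silently never be updated during training.

```python
import tensorflow as tf
from tensorflow import keras

scale = tf.Variable(2.0)  # a variable closed over by the Lambda below

inp = keras.Input(shape=(3,))
out = keras.layers.Lambda(lambda t: t * scale)(inp)
model = keras.Model(inp, out)

# The variable is used in the forward pass, but the model does not
# track it as a trainable weight
print(len(model.trainable_weights))  # 0
```

A custom layer that assigns the variable to an attribute in its constructor would be tracked properly, which is why a custom layer is the right tool once learnable parameters are involved.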