The data I'm studying now is negative-valued, or has both positive and negative values distributed around zero (roughly a normal distribution).
Existing convolutional networks don't seem to work well on it.
Has anyone done similar research? I'm hoping to get some experience and advice.
My feeling is that at least the ReLU activation can't be used, because it kills the negative values. Do I have to switch back to Sigmoid or tanh? A small sketch of the concern is below.
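To make the concern concrete, here is a minimal sketch (PyTorch assumed; the tensor values are illustrative only) showing that ReLU zeroes out negative inputs while tanh preserves their sign:

```python
import torch

# Toy activations spanning negative, zero, and positive values
x = torch.tensor([-2.0, -0.5, 0.0, 0.5, 2.0])

print(torch.relu(x))  # tensor([0.0000, 0.0000, 0.0000, 0.5000, 2.0000]) -- negatives wiped out
print(torch.tanh(x))  # tensor([-0.9640, -0.4621, 0.0000, 0.4621, 0.9640]) -- sign preserved
```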
CodePudding user response:
ReLU actually doesn't matter. Just add a constant to your data uniformly so that it becomes positive; the shift loses no information.
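A minimal sketch of this suggestion, assuming PyTorch and a hypothetical toy batch (names and shapes are illustrative, not from the original post): shift the input by a constant so it is non-negative before the network sees it. Since the shift is an invertible affine map, no information is lost.

```python
import torch

# Toy batch centered on zero, containing both positive and negative values
data = torch.randn(8, 1, 28, 28)

# Constant shift chosen here from the data's minimum; in practice any fixed
# bound computed once over the whole training set works the same way
shift = -data.min()
data_shifted = data + shift  # now data_shifted >= 0 everywhere

assert (data_shifted >= 0).all()
```

The same effect falls out of standard input normalization, and the first convolution's bias term can learn to absorb any fixed shift, which is why the choice of activation downstream matters less than it might seem.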