Common Activation Functions in Deep Learning: Sigmoid, ReLU & Softmax

Observations
1. Adding more convolution and pooling layers (strides=2, padding='same'): Test accuracy: 0.5981%
2. Adding a BatchNormalization() layer after each convolution layer: Test accuracy: 0.3589%
3. Adding a softmax activation to the Dense layer: Test accuracy: 2.2727%
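Since the section centers on Sigmoid, ReLU, and Softmax, here is a minimal NumPy sketch of the three activations for reference; the function names and the max-subtraction trick in softmax are standard conventions, not taken from the experiments above:

```python
import numpy as np

def sigmoid(x):
    # Squashes each input into (0, 1); common for binary outputs.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Zeroes out negative values; the usual hidden-layer default.
    return np.maximum(0.0, x)

def softmax(x):
    # Turns a vector of logits into a probability distribution;
    # subtracting the row max avoids overflow in exp().
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

print(sigmoid(np.array([0.0])))      # [0.5]
print(relu(np.array([-1.0, 3.0])))   # [0. 3.]
print(softmax(np.array([2.0, 1.0, 0.1])))  # sums to 1
```

Note that a Dense output layer for multi-class classification typically uses softmax so the scores are comparable as probabilities, which is what observation 3 above changes.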