1. BatchNorm1d(num_features, eps=1e-05, momentum=0.1, affine=True)
Applies batch normalization over a 2d or 3d input. During training, the layer computes the mean and variance of each batch and keeps a running (moving) average of these statistics; the default momentum of the running average is 0.1. During evaluation, the running mean/variance accumulated during training are used to normalize the data.
num_features: the number of input features. The expected input size is 'batch_size x num_features [x width]'.
Shape: - Input: (N, C) or (N, C, L) - Output: (N, C) or (N, C, L) (same shape as the input)
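A minimal sketch of the train/eval behavior described above (the feature count and batch size here are chosen just for illustration):

```python
import torch
import torch.nn as nn

# BatchNorm1d over a (batch_size, num_features) input
bn = nn.BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True)

x = torch.randn(8, 4)  # N=8, C=4

bn.train()       # training mode: normalize with batch statistics
y_train = bn(x)  # also updates bn.running_mean / bn.running_var

bn.eval()        # evaluation mode: normalize with the running statistics
y_eval = bn(x)

print(y_train.shape)          # same shape as the input: (8, 4)
print(bn.running_mean.shape)  # one running statistic per feature: (4,)
```

In training mode each feature of the output has (up to floating-point error) zero mean over the batch; in eval mode the output depends on the accumulated running statistics instead of the current batch.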
2. BatchNorm2d (same parameters as above)
Applies batch normalization over a 4d input (a mini-batch of 3d inputs).
num_features: the number of features of the expected input, whose size is 'batch_size x num_features x height x width'.
Shape: - Input: (N, C, H, W) - Output: (N, C, H, W) (same shape as the input)
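For the 4d case, the statistics are computed per channel over the N, H, and W dimensions. A short sketch (channel count and spatial size are arbitrary examples):

```python
import torch
import torch.nn as nn

# BatchNorm2d over a (batch_size, num_features, height, width) input;
# one mean/variance pair is maintained per channel
bn = nn.BatchNorm2d(3)  # e.g. 3 channels, as for an RGB feature map

x = torch.randn(8, 3, 16, 16)
y = bn(x)

print(y.shape)                # (8, 3, 16, 16) -- same shape as the input
print(bn.running_mean.shape)  # (3,) -- one statistic per channel
```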
3. BatchNorm3d (same parameters as above)
Applies batch normalization over a 5d input (a mini-batch of 4d inputs), e.g. volumetric data or video. The expected input size is 'batch_size x num_features x depth x height x width'.
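The 5d case follows the same pattern, with per-channel statistics computed over the N, D, H, and W dimensions (shapes below are illustrative):

```python
import torch
import torch.nn as nn

# BatchNorm3d over a (batch_size, num_features, depth, height, width) input
bn = nn.BatchNorm3d(2)

x = torch.randn(4, 2, 8, 8, 8)
y = bn(x)

print(y.shape)  # (4, 2, 8, 8, 8) -- same shape as the input
```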
The above is a detailed explanation of using batch normalization in PyTorch. I hope it serves as a useful reference, and I appreciate your continued support.