**torch.nn.Sigmoid**
`torch.nn.Sigmoid` is a class (a subclass of `nn.Module`). The object must be constructed before use and then called on the input. If you call the class on a tensor directly, you will get an error.
For example:

```python
import torch
from torch import nn

a = torch.randn(3, 4)
print(a)
sigmoid = nn.Sigmoid()   # construct the module first
a = sigmoid(a)           # then call the instance
print(a)
a = nn.Sigmoid(a)        # error: passes a tensor to the constructor
```
```
tensor([[ 0.2462, -2.1680, -1.4064, -0.0268],
        [-0.4800, -0.4670,  1.7318,  0.3498],
        [ 0.0137, -2.1080, -0.0825, -0.1350]])
tensor([[0.5612, 0.1027, 0.1968, 0.4933],
        [0.3823, 0.3853, 0.8496, 0.5866],
        [0.5034, 0.1083, 0.4794, 0.4663]])
Traceback (most recent call last):
  File "C:\file\Llama\", line 8, in <module>
    a = nn.Sigmoid(a)
        ^^^^^^^^^^^^^^^^^^^
  File "C:\Users\90929\AppData\Local\conda\conda\envs\lce\Lib\site-packages\torch\nn\modules\", line 485, in __init__
    raise TypeError(
TypeError: Sigmoid.__init__() takes 1 positional argument but 2 were given
```
**torch.sigmoid**
`torch.sigmoid` is a plain mathematical function and can be applied to a tensor directly, without constructing anything first.
```python
import torch

a = torch.randn(3, 4)
print(a)
a = torch.sigmoid(a)   # called directly on the tensor
print(a)
```
```
tensor([[-0.1516,  0.5398,  0.3226, -0.4956],
        [-0.2250,  0.6393,  0.4432,  0.4215],
        [-0.5741,  0.0689,  0.3078, -1.5994]])
tensor([[0.4622, 0.6318, 0.5799, 0.3786],
        [0.4440, 0.6546, 0.6090, 0.6039],
        [0.3603, 0.5172, 0.5763, 0.1681]])
```
**The differences**
More generally, `torch.nn` modules and `torch.nn.functional` functions differ in PyTorch in four ways: 1. inheritance; 2. trainable parameters; 3. implementation style; 4. calling convention.
1. Different inheritance
Most of the modules in `torch.nn` are implemented by inheriting from `nn.Module`. These modules are Python classes and must be instantiated before they can be used. The functions in `torch.nn.functional` are called directly, without instantiation.
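A minimal sketch of this first point, reusing the sigmoid examples above: the class must be instantiated, the function is called directly, and both compute the same values.

```python
import torch
from torch import nn

x = torch.randn(2, 3)

# nn.Sigmoid is a class: instantiate first, then call the instance
act = nn.Sigmoid()
y_module = act(x)

# torch.sigmoid is a plain function: call it directly
y_function = torch.sigmoid(x)

# Both paths compute exactly the same values
print(torch.allclose(y_module, y_function))
```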
2. Different trainable parameters
The modules in `torch.nn` can contain trainable parameters, and all trainable parameters can be obtained via the `parameters()` method and handed to an optimizer for training. The functions in `torch.nn.functional` hold no trainable parameters of their own; any weights must be passed in explicitly.
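As a sketch of this point: an `nn.Linear` layer owns its weight and bias and exposes them through `parameters()`, while the equivalent `F.linear` call is stateless and must be given the weights by hand.

```python
import torch
from torch import nn
import torch.nn.functional as F

layer = nn.Linear(4, 2)

# The module owns trainable parameters: a (2, 4) weight and a (2,) bias
params = list(layer.parameters())
print(len(params))

# The functional form has no state; weights are passed in explicitly
x = torch.randn(3, 4)
y1 = layer(x)
y2 = F.linear(x, layer.weight, layer.bias)
print(torch.allclose(y1, y2))
```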
3. Different implementation styles
The modules in `torch.nn` are implemented in an object-oriented style, while the functions in `torch.nn.functional` follow a functional-programming style. The functional API is therefore more convenient for composing and reusing operations, whereas `torch.nn` is better suited to defining stateful modules.
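A small sketch of the contrast: the object-oriented API composes stateful layer objects (for example with `nn.Sequential`), while the functional API composes ordinary function calls and leaves the state to the caller.

```python
import torch
from torch import nn
import torch.nn.functional as F

x = torch.randn(2, 3)

# Object-oriented composition: stateful modules combined into one module
net = nn.Sequential(nn.Linear(3, 3), nn.ReLU())

# Functional composition: plain function calls, state managed by the caller
w, b = net[0].weight, net[0].bias
y_functional = F.relu(F.linear(x, w, b))

print(torch.allclose(net(x), y_functional))
```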
4. Different calling conventions
The modules in `torch.nn` are called through class instances: you usually create a model instance first, then pass the input data through it for the forward computation. The functions in `torch.nn.functional` can be called directly; you only need to pass the input data into the function to perform the forward computation.
In short, `torch.nn` and `torch.nn.functional` are both used to build neural network models, but they differ in implementation style, calling convention, trainable parameters, and so on. In practice, select the appropriate modules and functions according to your specific needs.
**The connection**
An `nn` class calls the corresponding function inside its `forward()` method, so the classes in `torch.nn` can be understood as a higher-level encapsulation of the functions in `torch.nn.functional`.
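For instance, `nn.Sigmoid`'s `forward()` essentially just applies the sigmoid function. The hypothetical `MySigmoid` below (not PyTorch's actual source, just a sketch) makes this layering explicit:

```python
import torch
from torch import nn

# MySigmoid is a hypothetical stand-in for nn.Sigmoid: the class is only a
# thin object-oriented wrapper whose forward() delegates to the function.
class MySigmoid(nn.Module):
    def forward(self, x):
        return torch.sigmoid(x)

x = torch.randn(2, 3)
print(torch.allclose(MySigmoid()(x), nn.Sigmoid()(x)))
```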
**How to choose**
1. When to choose `torch.nn`
The `nn` module is recommended when defining the layers of a deep neural network:
First, when defining layers with learnable parameters (such as `Conv2d`, `Linear`, `BatchNorm2d`), the `nn` module initializes the variables for us; we only need to pass in a few configuration arguments.
Second, the resulting model class looks more coordinated and unified.
Third, `nn` modules can be composed, for example with `nn.Sequential`.
Fourth, when using dropout, the `nn` module is recommended, because dropout can easily be switched off via the `eval()` method at test time.
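A minimal sketch of the dropout point: switching the module to `eval()` disables `nn.Dropout` automatically, with no change to the forward code.

```python
import torch
from torch import nn

drop = nn.Dropout(p=0.5)
x = torch.ones(1000)

drop.train()   # training mode: elements are randomly zeroed
y_train = drop(x)

drop.eval()    # eval mode: dropout becomes the identity
y_eval = drop(x)

print((y_train == 0).any().item())   # some activations were dropped
print(torch.equal(y_eval, x))        # input passes through untouched
```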
2. When to choose `torch.nn.functional`
The functions in `torch.nn.functional` are lower-level than those in `nn`; although they come with less encapsulation, they are more transparent, and you can define exactly the operations you want on top of them.
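As an illustration of that transparency, a custom operation can be assembled directly from functional primitives with no module class at all. `leaky_sigmoid` below is a hypothetical activation invented for this sketch, not part of PyTorch:

```python
import torch
import torch.nn.functional as F

# A hypothetical custom activation built from low-level functional pieces:
# a leaky ReLU followed by a sigmoid, defined as a plain function.
def leaky_sigmoid(x, slope=0.01):
    return torch.sigmoid(F.leaky_relu(x, negative_slope=slope))

x = torch.tensor([-2.0, 0.0, 2.0])
print(leaky_sigmoid(x))
```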
This is the end of this article about the difference between `torch.nn.Sigmoid` and `torch.sigmoid` in PyTorch. For more PyTorch-related content, please search my previous articles or continue browsing the related articles below. I hope you will support me in the future!