
Using TensorBoard with a PyTorch neural network in Python

Installation of required libraries

A lot of people have asked how to visualize training in PyTorch, so I decided to write this up.

tensorboardX==2.0
tensorflow==1.13.2

Since TensorBoard was originally part of TensorFlow, you need to install TensorFlow; it ships with TensorBoard.
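
Both packages can be installed with pip, pinned to the versions above:

pip install tensorflow==1.13.2 tensorboardX==2.0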

You can also use PyTorch's own TensorBoard support (available since PyTorch 1.1) without installing tensorboardX. The import is as follows:

from torch.utils.tensorboard import SummaryWriter

However, since I ran into some bugs with the TensorBoard support that ships with PyTorch, this post uses tensorboardX.
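
Either way, the SummaryWriter interface is the same, so the code below only changes in its import line:

# tensorboardX (used in this post):
from tensorboardX import SummaryWriter
# or PyTorch's built-in equivalent:
# from torch.utils.tensorboard import SummaryWriter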

Commonly Used Functions

1. SummaryWriter()

This function is used to create a tensorboard file. The common parameters are:

  • log_dir: directory where the tensorboard file is stored
  • flush_secs: interval, in seconds, between writes to the tensorboard file

The call is made as follows:

writer = SummaryWriter(log_dir='logs', flush_secs=60)
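
As a minimal end-to-end sketch (the logs directory is created automatically if it does not already exist):

from tensorboardX import SummaryWriter

writer = SummaryWriter(log_dir='logs', flush_secs=60)
# ... calls such as add_graph()/add_scalar() go here ...
writer.close()  # flushes any remaining events to disk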

2. writer.add_graph()

This function writes the network structure to the Graphs section of the tensorboard file. The common parameters are:

  • model: the PyTorch model
  • input_to_model: an example input to the model

The result appears under the Graphs tab in TensorBoard.

The call is made as follows:

# Build a dummy input matching the model's expected shape (batch, channels, h, w)
if Cuda:
    graph_inputs = torch.from_numpy(np.random.rand(1, 3, input_shape[0], input_shape[1])).type(torch.FloatTensor).cuda()
else:
    graph_inputs = torch.from_numpy(np.random.rand(1, 3, input_shape[0], input_shape[1])).type(torch.FloatTensor)
writer.add_graph(model, (graph_inputs,))
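
The snippet above assumes that Cuda, input_shape, and model are defined elsewhere in the training script. For illustration, here is a self-contained sketch with a toy convolutional model (the model and shapes here are my own, not from the original):

import torch
from tensorboardX import SummaryWriter

# Toy model purely for demonstration
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, kernel_size=3, padding=1),
    torch.nn.ReLU(),
)
writer = SummaryWriter(log_dir='logs')
# Dummy image batch: (batch, channels, height, width)
graph_inputs = torch.randn(1, 3, 224, 224)
writer.add_graph(model, (graph_inputs,))
writer.close()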

3. writer.add_scalar()

This function is used to add a scalar value, such as the loss, to the tensorboard file. The common parameters are:

  • tag: the name of the curve, e.g. Train_loss below
  • scalar_value: the value to record under that tag
  • global_step: the x-axis coordinate of the point

The call is made as follows:

writer.add_scalar('Train_loss', loss, (epoch*epoch_size + iteration))
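
The same pattern extends to any number of curves; for example, a hypothetical validation loss (val_loss is not from the original code) could be logged under its own tag:

# Each tag gets its own curve in the Scalars tab
writer.add_scalar('Train_loss', loss, epoch * epoch_size + iteration)
writer.add_scalar('Val_loss', val_loss, epoch * epoch_size + iteration)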

4. tensorboard --logdir=

Once the tensorboard file has been generated, launch TensorBoard from the command line; it prints a URL that you can open in a browser. The command looks like this:

tensorboard --logdir=D:\Study\Collection\Tensorboard-pytorch\logs
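
By default TensorBoard serves on port 6006 (http://localhost:6006); the --port flag selects a different one, for example:

tensorboard --logdir=logs --port=6007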

Sample code

import torch
from torch.autograd import Variable
import torch.nn.functional as functional
from tensorboardX import SummaryWriter
import matplotlib.pyplot as plt
import numpy as np

# The shape of x is (100, 1)
x = torch.from_numpy(np.linspace(-1, 1, 100).reshape([100, 1])).type(torch.FloatTensor)
# The shape of y is (100, 1)
y = torch.sin(x) + 0.2 * torch.rand(x.size())

class Net(torch.nn.Module):
    def __init__(self, n_feature, n_hidden, n_output):
        super(Net, self).__init__()
        # Fully-connected layers, each applying a linear transformation: y = xA^T + b
        self.hidden = torch.nn.Linear(n_feature, n_hidden)
        self.predict = torch.nn.Linear(n_hidden, n_output)

    def forward(self, x):
        # Output of the hidden layer
        hidden_layer = functional.relu(self.hidden(x))
        output_layer = self.predict(hidden_layer)
        return output_layer

# Instantiate the network
net = Net(n_feature=1, n_hidden=10, n_output=1)
writer = SummaryWriter('logs')
graph_inputs = torch.from_numpy(np.random.rand(2, 1)).type(torch.FloatTensor)
writer.add_graph(net, (graph_inputs,))
# Adam optimizer
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
# Mean squared error loss
loss_func = torch.nn.MSELoss()

for t in range(1000):
    prediction = net(x)
    loss = loss_func(prediction, y)
    # Backpropagation steps:
    # 1. Zero the gradients
    optimizer.zero_grad()
    # 2. Compute the gradients
    loss.backward()
    # 3. Take an optimizer step
    optimizer.step()
    writer.add_scalar('loss', loss, t)

writer.close()

Running the script and opening TensorBoard shows the loss curve decreasing over training.

That concludes this detailed look at using TensorBoard with a PyTorch neural network in Python. For more on TensorBoard and PyTorch, please see my other related articles!