Initializing model parameters in PyTorch manually

Issue

I am creating a separate class to initialize my model, and I am appending the layers to a plain Python list. However, those layers are not being added to the model's parameters. How can I get them registered so they show up in the model's parameters()?

class Mnist_Net(nn.Module):
    def __init__(self, input_dim, output_dim, hidden_layers=2, neurons=128):
        super().__init__()
        layers = []
        for i in range(hidden_layers):
            if len(layers) == 0:
                layers.append(nn.Linear(input_dim, neurons))
            if i == hidden_layers - 1:
                layers.append(nn.Linear(layers[-2].weight.shape[0], output_dim))
            layers.append(nn.Linear(layers[i - 1].weight.shape[0], neurons))
        self.layers = layers

When I print model.parameters()

model = Mnist_Net(28*28, 10, neurons=56)
for t in model.parameters():
    print(t)

it prints nothing. But when I add a layer as a direct attribute in the class, like

self.layer1 = nn.Linear(input_dim,neurons)

it shows that one layer in the parameters. How can I make all the layers in self.layers appear in model.parameters()?

Solution

To be registered in the parent module, your submodules should be nn.Modules themselves. In your case, you should wrap layers with nn.ModuleList:

  self.layers = nn.ModuleList(layers)
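For context, here is a minimal self-contained sketch of the module with the fix applied. Note that the layer-building loop is simplified relative to the question's original code (an assumption beyond the answer itself), to focus on the registration mechanism:

```python
import torch
import torch.nn as nn

class Mnist_Net(nn.Module):
    def __init__(self, input_dim, output_dim, hidden_layers=2, neurons=128):
        super().__init__()
        layers = []
        in_features = input_dim
        # Simplified loop: each hidden layer feeds the next
        for _ in range(hidden_layers):
            layers.append(nn.Linear(in_features, neurons))
            in_features = neurons
        layers.append(nn.Linear(in_features, output_dim))
        # nn.ModuleList registers each nn.Linear as a submodule,
        # so their weights and biases appear in model.parameters()
        self.layers = nn.ModuleList(layers)

    def forward(self, x):
        # nn.ModuleList has no forward of its own; iterate manually
        for layer in self.layers[:-1]:
            x = torch.relu(layer(x))
        return self.layers[-1](x)

model = Mnist_Net(28 * 28, 10, neurons=56)
for t in model.parameters():
    print(t.shape)
```

With three nn.Linear layers registered, this prints six parameter tensors (a weight and a bias per layer).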

Then, your layers will be registered:

>>> model = Mnist_Net(28*28,10, neurons=56)

>>> for t in model.parameters():
...    print(t.shape)
torch.Size([56, 784])
torch.Size([56])
torch.Size([56, 56])
torch.Size([56])
torch.Size([10, 56])
torch.Size([10])
torch.Size([56, 56])
torch.Size([56])
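As an aside (not part of the original answer), nn.Sequential also registers its children like nn.ModuleList does, and additionally defines forward() as the chained application, which can save you from writing the loop yourself:

```python
import torch
import torch.nn as nn

# Alternative sketch: nn.Sequential registers each child module and
# applies them in order when the model is called.
model = nn.Sequential(
    nn.Linear(28 * 28, 56),
    nn.ReLU(),
    nn.Linear(56, 56),
    nn.ReLU(),
    nn.Linear(56, 10),
)
for t in model.parameters():
    print(t.shape)
```

Use nn.ModuleList when you need custom control flow in forward(), and nn.Sequential when the layers are simply applied one after another.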

Answered By – Ivan

This answer, collected from Stack Overflow, is licensed under CC BY-SA 2.5, CC BY-SA 3.0 and CC BY-SA 4.0.
