PyTorch - How to deactivate dropout in evaluation mode

佛祖请我去吃肉 2020-12-31 05:12

This is the model I defined; it is a simple LSTM with two fully connected layers.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F
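
The rest of the question's code was cut off. A minimal sketch of the kind of model described (a simple LSTM followed by two fully connected layers, with dropout registered in `__init__`) might look like this; the class name, layer sizes, and dropout probability are assumptions, not the asker's actual code:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

class LSTMClassifier(nn.Module):
    # Hypothetical model: an LSTM followed by two fully connected layers.
    # Dropout is defined in __init__, so model.train()/model.eval() control it.
    def __init__(self, input_size=8, hidden_size=16, num_classes=4):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.dropout = nn.Dropout(p=0.5)   # registered as a submodule
        self.fc1 = nn.Linear(hidden_size, hidden_size)
        self.fc2 = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        out, _ = self.lstm(x)              # (batch, seq, hidden)
        h = out[:, -1, :]                  # take the last time step
        h = self.dropout(torch.relu(self.fc1(h)))
        return self.fc2(h)

model = LSTMClassifier()
x = torch.randn(2, 5, 8)

model.eval()                     # dropout becomes the identity in eval mode
with torch.no_grad():
    a = model(x)
    b = model(x)

print(torch.equal(a, b))         # eval-mode outputs are deterministic
```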


        
3 Answers
  •  星月不相逢
    2020-12-31 05:53

    As the other answers said, the dropout layer should be defined in your model's __init__ method, so that the model can keep track of all its pre-defined layers. When the model's state changes, it notifies every registered layer to adjust its behavior. For instance, calling model.eval() deactivates the dropout layers, which then pass all activations through unchanged. In general, if you want to be able to deactivate your dropout layers, define them in __init__ using the nn.Dropout module.
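
The behavior described above can be checked directly: train()/eval() set the training flag on every registered submodule, and in eval mode nn.Dropout is the identity. A minimal sketch (the small Sequential model here is just for illustration):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
net = nn.Sequential(nn.Linear(4, 4), drop)

net.train()
assert drop.training            # train(): dropout is active

net.eval()                      # eval() propagates to all registered submodules
assert not drop.training        # dropout is now a no-op

x = torch.ones(1, 4)
with torch.no_grad():
    y1 = net(x)
    y2 = net(x)

print(torch.equal(y1, y2))      # identical outputs: dropout is disabled
```

If dropout were applied via F.dropout(x, p=0.5) in forward() without passing training=self.training, the model would have no way to switch it off, which is exactly why defining it as an nn.Dropout module in __init__ is preferred.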
