automatic-differentiation

What does the parameter retain_graph mean in the Variable's backward() method?

不羁的心 submitted on 2019-11-27 10:33:45
Question: I'm going through the neural transfer PyTorch tutorial and am confused about the use of retain_variable (deprecated, now referred to as retain_graph). The code example shows:

    class ContentLoss(nn.Module):

        def __init__(self, target, weight):
            super(ContentLoss, self).__init__()
            self.target = target.detach() * weight
            self.weight = weight
            self.criterion = nn.MSELoss()

        def forward(self, input):
            self.loss = self.criterion(input * self.weight, self.target)
            self.output = input
            return self.output

        def ...
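What retain_graph controls can be shown with a minimal sketch, using the current tensor API rather than the deprecated Variable wrapper (this is not the tutorial's code): PyTorch frees the autograd graph's intermediate buffers after a backward pass, so a second backward() through the same graph fails unless the first call passed retain_graph=True.

    import torch

    x = torch.ones(2, 2, requires_grad=True)
    y = (x * 2).sum()

    # First backward pass: keep the graph so it can be traversed again.
    y.backward(retain_graph=True)

    # Second backward pass through the same graph. Without retain_graph=True
    # above, this call would raise a RuntimeError because the graph's
    # intermediate buffers would already have been freed.
    y.backward()

    print(x.grad)  # gradients accumulate across passes: 2 + 2 = 4 per element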

How to do automatic differentiation on complex datatypes?

这一生的挚爱 submitted on 2019-11-27 03:44:17
Question: Given a very simple Matrix definition based on Vector:

    import Numeric.AD
    import qualified Data.Vector as V

    newtype Mat a = Mat { unMat :: V.Vector a }

    scale' f = Mat . V.map (*f) . unMat
    add' a b = Mat $ V.zipWith (+) (unMat a) (unMat b)
    sub' a b = Mat $ V.zipWith (-) (unMat a) (unMat b)
    mul' a b = Mat $ V.zipWith (*) (unMat a) (unMat b)
    pow' a e = Mat $ V.map (^e) (unMat a)

    sumElems' :: Num a => Mat a -> a
    sumElems' = V.sum . unMat

(for demonstration purposes ... I am using hmatrix but
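The usual route with the ad package is to give the wrapper type a Traversable instance, since grad differentiates functions over any Traversable container of scalars. A minimal sketch under that assumption; the cost function here is a made-up objective, not from the question:

    {-# LANGUAGE DeriveFunctor, DeriveFoldable, DeriveTraversable #-}
    import Numeric.AD (grad)
    import qualified Data.Vector as V

    newtype Mat a = Mat { unMat :: V.Vector a }
      deriving (Functor, Foldable, Traversable)

    -- Sum of squared elements, kept polymorphic in the scalar type so
    -- that grad can instantiate it at its reverse-mode AD type.
    cost :: Num a => Mat a -> a
    cost = V.sum . V.map (^ (2 :: Int)) . unMat

    main :: IO ()
    main = print (unMat (grad cost (Mat (V.fromList [1, 2, 3 :: Double]))))
    -- The gradient of sum x_i^2 is 2*x_i, so this prints [2.0,4.0,6.0]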