RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation?


I am running some GAN tests with pytorch-1.5. My code is a very simple GAN that just fits the sin(x) function:

import torch
import torch.nn as nn
import numpy as np
import matplotlib.pyplot as plt


# Hyper Parameters
BATCH_SIZE = 64
LR_G = 0.0001
LR_D = 0.0001 
N_IDEAS = 5  
ART_COMPONENTS = 15 
PAINT_POINTS = np.vstack([np.linspace(-1, 1, ART_COMPONENTS) for _ in range(BATCH_SIZE)])


def artist_works():  # painting from the famous artist (real target)
    r = 0.02 * np.random.randn(1, ART_COMPONENTS)
    paintings = np.sin(PAINT_POINTS * np.pi) + r
    paintings = torch.from_numpy(paintings).float()
    return paintings


G = nn.Sequential(  # Generator
    nn.Linear(N_IDEAS, 128),  # random ideas (could be drawn from a normal distribution)
    nn.ReLU(),
    nn.Linear(128, ART_COMPONENTS),  # making a painting from these random ideas
)

D = nn.Sequential(  # Discriminator
    nn.Linear(ART_COMPONENTS, 128),  # receive art work either from the famous artist or a newbie like G
    nn.ReLU(),
    nn.Linear(128, 1),
    nn.Sigmoid(),  # tell the probability that the art work is made by artist
)

opt_D = torch.optim.Adam(D.parameters(), lr=LR_D)
opt_G = torch.optim.Adam(G.parameters(), lr=LR_G)


for step in range(10000):
    artist_paintings = artist_works()  # real painting from artist
    G_ideas = torch.randn(BATCH_SIZE, N_IDEAS)  # random ideas
    G_paintings = G(G_ideas)  # fake painting from G (random ideas)

    prob_artist0 = D(artist_paintings)  # D try to increase this prob
    prob_artist1 = D(G_paintings)  # D try to reduce this prob

    D_loss = - torch.mean(torch.log(prob_artist0) + torch.log(1. - prob_artist1))
    G_loss = torch.mean(torch.log(1. - prob_artist1))

    opt_D.zero_grad()
    D_loss.backward(retain_graph=True)  # reusing computational graph
    opt_D.step()

    opt_G.zero_grad()
    G_loss.backward()
    opt_G.step()

But when I run it, I get this error:

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.FloatTensor [128, 1]], which is output 0 of TBackward, is at version 2; expected version 1 instead. Hint: the backtrace further above shows the operation that failed to compute its gradient. The variable in question was changed in there or anywhere later. Good luck!

Is there something wrong with my code?

Ilyes KAANICH

This happens because opt_D.step() modifies the discriminator's parameters in place, but those same parameters are needed to compute the generator's gradient. You can fix this by changing your code to the following:

for step in range(10000):
    artist_paintings = artist_works()  # real painting from artist
    G_ideas = torch.randn(BATCH_SIZE, N_IDEAS)  # random ideas
    G_paintings = G(G_ideas)  # fake painting from G (random ideas)

    prob_artist1 = D(G_paintings)  # G tries to fool D

    G_loss = torch.mean(torch.log(1. - prob_artist1))
    opt_G.zero_grad()
    G_loss.backward()
    opt_G.step()

    prob_artist0 = D(artist_paintings)  # D try to increase this prob
    # detach here so we don't backprop through G, whose parameters have already been updated by opt_G.step()
    prob_artist1 = D(G_paintings.detach())  # D try to reduce this prob

    D_loss = - torch.mean(torch.log(prob_artist0) + torch.log(1. - prob_artist1))
    opt_D.zero_grad()
    D_loss.backward()  # this graph is not reused, so retain_graph is no longer needed
    opt_D.step()

You can find more information about this issue at https://github.com/pytorch/pytorch/issues/39141
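
For reference, the same error can also be avoided while keeping the original discriminator-first update order: detach G_paintings in the discriminator pass so that opt_D.step() never invalidates a graph the generator still needs, and run a fresh forward pass through the updated discriminator before computing G_loss. The following is only a minimal sketch of that alternative, reusing the names defined above; it is not part of the original answer.

for step in range(10000):
    artist_paintings = artist_works()  # real paintings from the artist
    G_ideas = torch.randn(BATCH_SIZE, N_IDEAS)  # random ideas
    G_paintings = G(G_ideas)  # fake paintings from G

    # Discriminator update: detach the fake paintings so no gradients flow
    # back into G, and D's step does not touch a graph G still depends on.
    prob_artist0 = D(artist_paintings)
    prob_artist1 = D(G_paintings.detach())
    D_loss = - torch.mean(torch.log(prob_artist0) + torch.log(1. - prob_artist1))
    opt_D.zero_grad()
    D_loss.backward()
    opt_D.step()

    # Generator update: a fresh forward pass through the updated D, so the
    # backward pass only sees the current versions of D's parameters.
    prob_artist1 = D(G_paintings)
    G_loss = torch.mean(torch.log(1. - prob_artist1))
    opt_G.zero_grad()
    G_loss.backward()
    opt_G.step()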
