Introduction to Generative Adversarial Networks: Generating MNIST Handwritten Digits

This article is an internal article of the 🔗365-Day Deep Learning Training Camp

Original author: K同学啊

 I. Theoretical Background

  Generative Adversarial Networks (GANs) have been one of the hottest research directions in deep learning in recent years. "GAN" does not refer to one specific neural network, but to a family of neural networks designed around the idea of a game between two players. A GAN consists of two networks called the generator and the discriminator. The generator takes random samples from some noise distribution as input and outputs artificial samples that closely resemble the real samples in the training set; the discriminator takes either a real sample or an artificial sample as input, and its goal is to tell the two apart as accurately as possible. The generator and discriminator are trained alternately, competing against each other, so that both improve. Ideally, after enough rounds of this game the discriminator can no longer judge whether a given sample is real, i.e. it outputs a 50% real, 50% fake verdict for every sample. At that point the generator's artificial samples are realistic enough to fool the discriminator, the game stops, and we are left with a generator capable of "forging" realistic samples.
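This game can be stated precisely. In the notation of Goodfellow et al. (2014), the generator G and the discriminator D play the minimax game

```latex
\min_G \max_D V(D, G)
  = \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\bigl[\log D(x)\bigr]
  + \mathbb{E}_{z \sim p_z(z)}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr]
```

D is trained to push V up (score real samples near 1 and fakes near 0), while G is trained to push it down. At the ideal equilibrium the generator matches the data distribution and D(x) = 1/2 everywhere, which is exactly the "50% real, 50% fake" verdict described above.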

1. The Generator

  In a GAN, the generator G takes random noise z as input and, through repeated fitting, eventually outputs a fake sample G(z) with the same size as, and a distribution similar to, the real samples. The generator is essentially a generative model: it learns the assumed distribution of the data and its parameters, then draws new samples from the learned model.
  Mathematically, a generative method first makes a distributional assumption about the observed or latent variables of the given real data; it then feeds the real data into the model to fit those variables and parameters; finally it obtains a learned approximate distribution from which new data can be generated. From a machine learning perspective, the model makes no explicit distributional assumption; instead it is corrected iteratively by learning from the real data, ultimately also yielding a learned model that can generate samples. Unlike the mathematical approach, this learning process is less transparent to human understanding.

2. The Discriminator

  In a GAN, the discriminator D outputs, for an input sample x, a probability D(x) in [0, 1]. Here x may be a real sample from the original dataset or an artificial sample G(z) produced by the generator. By convention, the closer D(x) is to 1, the more likely the sample is real; the smaller the value, the more likely the sample is fake. In other words, the discriminator is a binary neural network classifier whose goal is not to recognize the original class of the input but to distinguish real samples from fake ones. Note that neither the generator nor the discriminator uses any class labels, which shows that GAN training is an unsupervised learning process.

3. How It Works

  GANs combine game theory with machine learning. They were introduced in Ian Goodfellow's 2014 paper and became popular immediately, which shows how enthusiastically the community embraced the idea. To understand GANs in more depth, it helps to know where they came from and why they matter. Researchers originally wanted computers to generate data automatically: for example, after a model has seen some pictures of apples, it should be able to generate new apple pictures on its own; an algorithm with this capability is considered generative. GAN was not the first generative algorithm, however. Earlier generative algorithms measured the gap between generated and real images with a mean-squared-error loss, and researchers found that two generated images with identical MSE could look drastically different. Ian Goodfellow proposed GAN to address this shortcoming.

 So how does a GAN actually generate images? As shown in Figure 1, a GAN consists of two models: a generative model G and a discriminative model D. The first-generation generator G1 takes random noise z as input and produces a crude image; the first-generation discriminator D1 is then trained to perform binary classification, labeling generated images 0 and real images 1. To fool D1, G1 is optimized and upgraded to a second generation; once its outputs successfully fool D1, the discriminator is optimized in turn and upgraded to D2. Repeating this process yields the N-th generation of G and D.
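Concretely, each round of this game alternates two gradient steps. With binary cross-entropy as the loss (which is how the training code in this post implements it), the discriminator and generator minimize

```latex
\mathcal{L}_D = -\,\mathbb{E}_{x}\bigl[\log D(x)\bigr]
               -\,\mathbb{E}_{z}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr],
\qquad
\mathcal{L}_G = -\,\mathbb{E}_{z}\bigl[\log D(G(z))\bigr]
```

Note that this generator loss is the non-saturating form: instead of minimizing log(1 − D(G(z))), the generator maximizes log D(G(z)), which gives stronger gradients early in training when the discriminator easily rejects the fakes.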

II. Preliminary Work

1. Define the hyperparameters

import argparse
import os
import numpy as np
import torchvision.transforms as transforms
from torch.utils.data import DataLoader
from torchvision.utils import save_image
from torchvision import datasets
from torch.autograd import Variable
import torch.nn as nn
import torch
import warnings
warnings.filterwarnings('ignore')

# Create output folders
os.makedirs('./images/', exist_ok=True)         # sample images saved during training
os.makedirs('./save/', exist_ok=True)           # where the trained models are saved
os.makedirs('./datasets/mnist', exist_ok=True)  # where the dataset is downloaded

# Hyperparameter configuration
n_epochs = 50         # total number of training epochs; more epochs give the model more chances to learn the data, but risk overfitting
batch_size = 64       # samples per update; small batches make training noisier (which can help escape local minima), large batches are more stable but need more memory
lr = 0.0002           # learning rate; too large and the model oscillates around (or diverges from) the optimum, too small and convergence is slow
b1 = 0.5              # b1/b2 are Adam's exponential decay rates for the first moment (running mean of gradients)
b2 = 0.999            # and the second moment (running mean of squared gradients); they affect stability and convergence speed
n_cpu = 2             # number of CPU workers for data loading; affects preprocessing/loading speed and thus training efficiency
latent_dim = 100      # dimension of the random noise vector; too low limits the diversity of generated images, too high makes training harder
img_size = 28         # image side length; larger images need more compute and longer training
channels = 1          # number of image channels: 3 for RGB, 1 for grayscale
sample_interval = 500 # how often (in batches) to save generated images, for monitoring sample quality during training

# Image shape (1, 28, 28) and pixel count (784)
img_shape = (channels, img_size, img_size)
img_area = np.prod(img_shape)

# Use CUDA if a GPU is available
cuda = True if torch.cuda.is_available() else False
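As a quick sanity check (NumPy only, so it runs without PyTorch), the image shape and pixel count defined above work out as follows:

```python
import numpy as np

channels, img_size = 1, 28
img_shape = (channels, img_size, img_size)  # (C, H, W) for a grayscale MNIST image
img_area = int(np.prod(img_shape))          # total number of pixels: 1 * 28 * 28

print(img_shape)  # (1, 28, 28)
print(img_area)   # 784
```

This 784 is the input width of the discriminator and the output width of the generator below.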

2. Download the data

# Download the MNIST dataset
mnist = datasets.MNIST(
    root='./datasets/', train=True, download=True,
    transform=transforms.Compose([
        transforms.Resize(img_size),
        transforms.ToTensor(),
        transforms.Normalize([0.5], [0.5]),
    ]),
)
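The transform pipeline matters for GAN training: ToTensor() scales raw pixels from [0, 255] to [0, 1], and Normalize([0.5], [0.5]) then applies (x − 0.5) / 0.5, mapping them to [−1, 1], the same range as the generator's Tanh output. A NumPy-only sketch of that last step:

```python
import numpy as np

# Normalize([0.5], [0.5]) computes (x - mean) / std with mean = std = 0.5
pixels = np.array([0.0, 0.25, 0.5, 1.0])  # example values after ToTensor()
normalized = (pixels - 0.5) / 0.5

print(normalized)  # [-1.  -0.5  0.   1. ]
```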

 3. Configure the data

# Wrap the dataset in a data loader
dataloader = DataLoader(mnist, batch_size=batch_size, shuffle=True)

 III. Defining the Models

1. The discriminator model

'''
Define the discriminator
'''
# Flatten the 28*28 image into a 784-dim vector, pass it through a multilayer perceptron
# with LeakyReLU activations (negative slope 0.2), and finish with a sigmoid that yields
# a probability between 0 and 1 for binary classification.
class Discriminator(nn.Module):
    def __init__(self):
        super(Discriminator, self).__init__()
        self.model = nn.Sequential(
            nn.Linear(img_area, 512),         # 784 input features -> 512
            nn.LeakyReLU(0.2, inplace=True),  # non-linear mapping
            nn.Linear(512, 256),              # 512 input features -> 256
            nn.LeakyReLU(0.2, inplace=True),  # non-linear mapping
            nn.Linear(256, 1),                # 256 input features -> 1
            nn.Sigmoid()                      # sigmoid for binary classification (softmax would be used for multi-class)
        )

    def forward(self, img):
        img_flat = img.view(img.size(0), -1)  # flatten the input image to (batch, 784), e.g. (64, 784)
        validity = self.model(img_flat)       # run it through the discriminator network
        return validity                       # a probability in [0, 1]
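The two activations the discriminator relies on are easy to state outside of PyTorch; here is a NumPy sketch (an illustration only, not the `nn.LeakyReLU`/`nn.Sigmoid` modules themselves):

```python
import numpy as np

def leaky_relu(x, negative_slope=0.2):
    # positive inputs pass through unchanged; negative inputs are scaled by the slope
    return np.where(x > 0, x, negative_slope * x)

def sigmoid(x):
    # squashes any real value into (0, 1): the "probability of being real"
    return 1.0 / (1.0 + np.exp(-x))

print(leaky_relu(np.array([-1.0, 2.0])))  # [-0.2  2. ]
print(sigmoid(0.0))                       # 0.5
```

The non-zero slope on negative inputs keeps gradients flowing through the discriminator, which helps avoid dead units during adversarial training.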

 2. The generator model

'''
Define the generator
'''
# The input is a 100-dim Gaussian noise vector. A first linear layer maps it to 128 dims,
# followed by a LeakyReLU; further linear + LeakyReLU blocks widen it step by step;
# a final linear layer maps it to 784 dims, and Tanh squashes the fake image data into [-1, 1].
class Generator(nn.Module):
    def __init__(self):
        super(Generator, self).__init__()

        # Intermediate building block: Linear (+ optional BatchNorm) + LeakyReLU
        def block(in_feat, out_feat, normalize=True):
            layers = [nn.Linear(in_feat, out_feat)]           # linear map to out_feat dims
            if normalize:
                layers.append(nn.BatchNorm1d(out_feat, 0.8))  # batch normalization (note: the positional 0.8 sets eps, not momentum)
            layers.append(nn.LeakyReLU(0.2, inplace=True))    # non-linear activation
            return layers

        # np.prod() returns the product over the given shape: 1*28*28 = 784
        self.model = nn.Sequential(
            *block(latent_dim, 128, normalize=False),  # 100 -> 128, LeakyReLU
            *block(128, 256),                          # 128 -> 256, BatchNorm, LeakyReLU
            *block(256, 512),                          # 256 -> 512, BatchNorm, LeakyReLU
            *block(512, 1024),                         # 512 -> 1024, BatchNorm, LeakyReLU
            nn.Linear(1024, img_area),                 # 1024 -> 784
            nn.Tanh()                                  # map each of the 784 values into [-1, 1]
        )

    # view(): like numpy's reshape; here we reshape to (64, 1, 28, 28)
    def forward(self, z):                              # input: (64, 100) noise
        imgs = self.model(z)                           # run the noise through the generator
        imgs = imgs.view(imgs.size(0), *img_shape)     # reshape to (64, 1, 28, 28)
        return imgs                                    # 64 images of shape (1, 28, 28)
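The chain of widths 100 → 128 → 256 → 512 → 1024 → 784 determines the bulk of the generator's size. Counting just the linear layers (weights plus biases; the BatchNorm layers add only 2 parameters per feature and are omitted here):

```python
# Layer widths of the generator's linear layers, as defined above
dims = [100, 128, 256, 512, 1024, 784]

# Each nn.Linear(d_in, d_out) holds d_in * d_out weights plus d_out biases
linear_params = sum(d_in * d_out + d_out for d_in, d_out in zip(dims, dims[1:]))

print(linear_params)  # 1506448, i.e. roughly 1.5M parameters
```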

 IV. Training the Model

1. Train the model

'''
Train the model
'''
# Create the generator and discriminator
generator = Generator()
discriminator = Discriminator()

# Loss: binary cross-entropy
criterion = torch.nn.BCELoss()

# Optimizers; betas are the coefficients for the running averages
# of the gradient and of its square
optimizer_G = torch.optim.Adam(generator.parameters(), lr=lr, betas=(b1, b2))
optimizer_D = torch.optim.Adam(discriminator.parameters(), lr=lr, betas=(b1, b2))

# Run everything on CUDA if a GPU is available
if torch.cuda.is_available():
    generator = generator.cuda()
    discriminator = discriminator.cuda()
    criterion = criterion.cuda()

# Training loop
for epoch in range(n_epochs):
    for i, (imgs, _) in enumerate(dataloader):
        # ======================== Train the discriminator ========================
        # view(): like numpy's reshape; (64, 1, 28, 28) becomes (64, 784)
        imgs = imgs.view(imgs.size(0), -1)                   # flatten each image to 28*28 = 784
        real_img = Variable(imgs)                            # wrap the tensor in a Variable so it joins the graph and gradients can flow
        real_label = Variable(torch.ones(imgs.size(0), 1))   # label 1 for real images
        fake_label = Variable(torch.zeros(imgs.size(0), 1))  # label 0 for fake images

        # -----------------------------------------------------
        # Train Discriminator
        # Two parts: (1) classify real images as real, (2) classify fake images as fake
        # -----------------------------------------------------
        # Loss on real images
        real_out = discriminator(real_img)                   # feed real images to the discriminator
        loss_real_D = criterion(real_out, real_label)        # loss on real images
        real_scores = real_out                               # D's scores on real images; the closer to 1, the better

        # Loss on fake images
        # detach(): cut the graph here so gradients don't flow back into G, since G is not updated in this step
        z = Variable(torch.randn(imgs.size(0), latent_dim))  # sample random noise
        fake_img = generator(z).detach()                     # feed the noise to the generator to get fake images
        fake_out = discriminator(fake_img)                   # the discriminator scores the fakes
        loss_fake_D = criterion(fake_out, fake_label)        # loss on fake images
        fake_scores = fake_out                               # for the discriminator, fake scores closer to 0 are better

        # Combine the losses and optimize
        loss_D = loss_real_D + loss_fake_D                   # loss on reals plus loss on fakes
        optimizer_D.zero_grad()                              # reset gradients before backprop
        loss_D.backward()                                    # backpropagate the error
        optimizer_D.step()                                   # update D's parameters

        # ---------------------
        # Train Generator
        # Goal: make fake images be judged as real by the discriminator.
        # The discriminator is held fixed; the fakes are scored against the real labels,
        # and backprop updates only the generator's parameters, so over time
        # the generator learns to produce images the discriminator takes for real.
        # ---------------------
        z = Variable(torch.randn(imgs.size(0), latent_dim))  # sample random noise
        fake_img = generator(z)                              # feed the noise to the generator to get fake images
        output = discriminator(fake_img)                     # the discriminator's verdict on the fakes

        # Loss and optimization
        loss_G = criterion(output, real_label)               # loss of the fakes against the real labels
        optimizer_G.zero_grad()                              # reset gradients before backprop
        loss_G.backward()                                    # backpropagate the error
        optimizer_G.step()                                   # update G's parameters

        # Logging
        if (i + 1) % 300 == 0:
            print("[Epoch %d/%d] [Batch %d/%d]  [D loss:%f] [G loss:%f]  [D real:%f] [D fake:%f]"
                  % (epoch, n_epochs, i, len(dataloader), loss_D.item(), loss_G.item(),
                     real_scores.data.mean(), fake_scores.data.mean()))

        # Save sample images during training
        batches_done = epoch * len(dataloader) + i
        if batches_done % sample_interval == 0:
            save_image(fake_img.data[:25], "./images/%d.png" % batches_done, nrow=5, normalize=True)

[Epoch 0/50] [Batch 299/938]  [D loss:1.126462] [G loss:0.899555]  [D real:0.516139] [D fake:0.339249]
[Epoch 0/50] [Batch 599/938]  [D loss:1.149556] [G loss:0.897094]  [D real:0.544126] [D fake:0.379397]
[Epoch 0/50] [Batch 899/938]  [D loss:0.996889] [G loss:1.260947]  [D real:0.650683] [D fake:0.410245]
[Epoch 1/50] [Batch 299/938]  [D loss:1.115366] [G loss:2.147109]  [D real:0.732169] [D fake:0.522674]
[Epoch 1/50] [Batch 599/938]  [D loss:1.085733] [G loss:2.925582]  [D real:0.830789] [D fake:0.563939]
[Epoch 1/50] [Batch 899/938]  [D loss:1.212135] [G loss:2.759920]  [D real:0.856782] [D fake:0.635772]
[Epoch 2/50] [Batch 299/938]  [D loss:1.120076] [G loss:1.927145]  [D real:0.809302] [D fake:0.573418]
[Epoch 2/50] [Batch 599/938]  [D loss:0.918613] [G loss:1.235865]  [D real:0.622094] [D fake:0.307855]
[Epoch 2/50] [Batch 899/938]  [D loss:0.959764] [G loss:1.823482]  [D real:0.820892] [D fake:0.509622]
[Epoch 3/50] [Batch 299/938]  [D loss:0.853248] [G loss:1.786410]  [D real:0.744379] [D fake:0.380615]
[Epoch 3/50] [Batch 599/938]  [D loss:0.892074] [G loss:2.111202]  [D real:0.760768] [D fake:0.390234]
[Epoch 3/50] [Batch 899/938]  [D loss:0.989855] [G loss:2.272981]  [D real:0.836766] [D fake:0.510386]
[Epoch 4/50] [Batch 299/938]  [D loss:0.907261] [G loss:2.853382]  [D real:0.838500] [D fake:0.472724]
[Epoch 4/50] [Batch 599/938]  [D loss:1.158909] [G loss:0.798443]  [D real:0.518166] [D fake:0.145365]
[Epoch 4/50] [Batch 899/938]  [D loss:0.811972] [G loss:2.601337]  [D real:0.823196] [D fake:0.405719]
[Epoch 5/50] [Batch 299/938]  [D loss:0.591123] [G loss:2.082723]  [D real:0.771063] [D fake:0.245541]
[Epoch 5/50] [Batch 599/938]  [D loss:0.647286] [G loss:1.995042]  [D real:0.787497] [D fake:0.275002]
[Epoch 5/50] [Batch 899/938]  [D loss:0.620029] [G loss:1.987055]  [D real:0.767942] [D fake:0.245935]
[Epoch 6/50] [Batch 299/938]  [D loss:0.920807] [G loss:1.601699]  [D real:0.762614] [D fake:0.389948]
[Epoch 6/50] [Batch 599/938]  [D loss:0.824215] [G loss:1.587954]  [D real:0.704582] [D fake:0.226340]
[Epoch 6/50] [Batch 899/938]  [D loss:0.718090] [G loss:1.939514]  [D real:0.758678] [D fake:0.297810]
[Epoch 7/50] [Batch 299/938]  [D loss:0.750032] [G loss:1.268215]  [D real:0.703582] [D fake:0.235967]
[Epoch 7/50] [Batch 599/938]  [D loss:0.783685] [G loss:2.126447]  [D real:0.778962] [D fake:0.343705]
[Epoch 7/50] [Batch 899/938]  [D loss:0.807744] [G loss:1.234891]  [D real:0.703821] [D fake:0.264793]
[Epoch 8/50] [Batch 299/938]  [D loss:1.123362] [G loss:0.812300]  [D real:0.537284] [D fake:0.226940]
[Epoch 8/50] [Batch 599/938]  [D loss:0.932809] [G loss:1.276933]  [D real:0.694338] [D fake:0.332504]
[Epoch 8/50] [Batch 899/938]  [D loss:0.973240] [G loss:0.877600]  [D real:0.537963] [D fake:0.156072]
[Epoch 9/50] [Batch 299/938]  [D loss:0.851623] [G loss:1.151848]  [D real:0.613663] [D fake:0.194390]
[Epoch 9/50] [Batch 599/938]  [D loss:0.757661] [G loss:2.185205]  [D real:0.798452] [D fake:0.331717]
[Epoch 9/50] [Batch 899/938]  [D loss:0.930353] [G loss:1.250855]  [D real:0.647663] [D fake:0.235231]
[Epoch 10/50] [Batch 299/938]  [D loss:0.846923] [G loss:1.422895]  [D real:0.705119] [D fake:0.298378]
[Epoch 10/50] [Batch 599/938]  [D loss:1.120350] [G loss:2.781064]  [D real:0.871781] [D fake:0.587554]
[Epoch 10/50] [Batch 899/938]  [D loss:0.824792] [G loss:2.011739]  [D real:0.735258] [D fake:0.330044]
[Epoch 11/50] [Batch 299/938]  [D loss:0.949749] [G loss:1.582255]  [D real:0.649686] [D fake:0.275324]
[Epoch 11/50] [Batch 599/938]  [D loss:0.982256] [G loss:1.346781]  [D real:0.648031] [D fake:0.319623]
[Epoch 11/50] [Batch 899/938]  [D loss:1.134111] [G loss:0.734665]  [D real:0.502376] [D fake:0.201197]
[Epoch 12/50] [Batch 299/938]  [D loss:0.886618] [G loss:1.887888]  [D real:0.751914] [D fake:0.387430]
[Epoch 12/50] [Batch 599/938]  [D loss:0.980123] [G loss:1.845785]  [D real:0.826584] [D fake:0.499849]
[Epoch 12/50] [Batch 899/938]  [D loss:1.128903] [G loss:0.812115]  [D real:0.447066] [D fake:0.128298]
[Epoch 13/50] [Batch 299/938]  [D loss:1.307499] [G loss:1.138010]  [D real:0.444763] [D fake:0.118793]
[Epoch 13/50] [Batch 599/938]  [D loss:0.919566] [G loss:1.435858]  [D real:0.705751] [D fake:0.372891]
[Epoch 13/50] [Batch 899/938]  [D loss:1.045991] [G loss:0.795672]  [D real:0.551619] [D fake:0.233923]
[Epoch 14/50] [Batch 299/938]  [D loss:0.974308] [G loss:1.132812]  [D real:0.645558] [D fake:0.321130]
[Epoch 14/50] [Batch 599/938]  [D loss:1.077103] [G loss:1.873058]  [D real:0.763407] [D fake:0.485259]
[Epoch 14/50] [Batch 899/938]  [D loss:1.154649] [G loss:1.311222]  [D real:0.643791] [D fake:0.423526]
[Epoch 15/50] [Batch 299/938]  [D loss:1.055950] [G loss:1.662198]  [D real:0.711471] [D fake:0.457155]
[Epoch 15/50] [Batch 599/938]  [D loss:0.976650] [G loss:0.991488]  [D real:0.591312] [D fake:0.272113]
[Epoch 15/50] [Batch 899/938]  [D loss:0.972705] [G loss:1.357392]  [D real:0.667669] [D fake:0.373288]
[Epoch 16/50] [Batch 299/938]  [D loss:0.952374] [G loss:1.087495]  [D real:0.587684] [D fake:0.241297]
[Epoch 16/50] [Batch 599/938]  [D loss:0.904115] [G loss:1.359004]  [D real:0.717762] [D fake:0.368899]
[Epoch 16/50] [Batch 899/938]  [D loss:0.946697] [G loss:1.670658]  [D real:0.765068] [D fake:0.431645]
[Epoch 17/50] [Batch 299/938]  [D loss:0.997313] [G loss:0.943261]  [D real:0.616844] [D fake:0.312825]
[Epoch 17/50] [Batch 599/938]  [D loss:1.030199] [G loss:1.093042]  [D real:0.555989] [D fake:0.235301]
[Epoch 17/50] [Batch 899/938]  [D loss:0.911224] [G loss:1.282301]  [D real:0.661065] [D fake:0.313903]
[Epoch 18/50] [Batch 299/938]  [D loss:1.039436] [G loss:1.392826]  [D real:0.689580] [D fake:0.437706]
[Epoch 18/50] [Batch 599/938]  [D loss:1.082913] [G loss:0.873606]  [D real:0.558201] [D fake:0.300038]
[Epoch 18/50] [Batch 899/938]  [D loss:1.258200] [G loss:0.618452]  [D real:0.442770] [D fake:0.194379]
[Epoch 19/50] [Batch 299/938]  [D loss:1.082308] [G loss:1.268667]  [D real:0.520880] [D fake:0.255018]
[Epoch 19/50] [Batch 599/938]  [D loss:0.905622] [G loss:1.311422]  [D real:0.732448] [D fake:0.399873]
[Epoch 19/50] [Batch 899/938]  [D loss:1.077982] [G loss:1.436989]  [D real:0.724235] [D fake:0.479413]
[Epoch 20/50] [Batch 299/938]  [D loss:1.031461] [G loss:1.433434]  [D real:0.704877] [D fake:0.433269]
[Epoch 20/50] [Batch 599/938]  [D loss:0.979215] [G loss:1.721090]  [D real:0.748500] [D fake:0.444550]
[Epoch 20/50] [Batch 899/938]  [D loss:0.967548] [G loss:0.979543]  [D real:0.605029] [D fake:0.275011]
[Epoch 21/50] [Batch 299/938]  [D loss:1.008990] [G loss:1.505808]  [D real:0.700990] [D fake:0.414113]
[Epoch 21/50] [Batch 599/938]  [D loss:1.120533] [G loss:0.947614]  [D real:0.501168] [D fake:0.196343]
[Epoch 21/50] [Batch 899/938]  [D loss:0.963488] [G loss:1.843049]  [D real:0.777486] [D fake:0.464934]
[Epoch 22/50] [Batch 299/938]  [D loss:0.975867] [G loss:1.108254]  [D real:0.650325] [D fake:0.377432]
[Epoch 22/50] [Batch 599/938]  [D loss:0.957223] [G loss:1.135555]  [D real:0.639857] [D fake:0.328309]
[Epoch 22/50] [Batch 899/938]  [D loss:0.987199] [G loss:1.326054]  [D real:0.667016] [D fake:0.364796]
[Epoch 23/50] [Batch 299/938]  [D loss:0.920097] [G loss:1.332339]  [D real:0.706342] [D fake:0.359756]
[Epoch 23/50] [Batch 599/938]  [D loss:1.022273] [G loss:1.082345]  [D real:0.587763] [D fake:0.281549]
[Epoch 23/50] [Batch 899/938]  [D loss:0.908397] [G loss:1.259532]  [D real:0.649278] [D fake:0.304928]
[Epoch 24/50] [Batch 299/938]  [D loss:1.084111] [G loss:1.708223]  [D real:0.748224] [D fake:0.492710]
[Epoch 24/50] [Batch 599/938]  [D loss:1.118541] [G loss:1.251814]  [D real:0.624162] [D fake:0.374923]
[Epoch 24/50] [Batch 899/938]  [D loss:1.082891] [G loss:1.213567]  [D real:0.622567] [D fake:0.365029]
[Epoch 25/50] [Batch 299/938]  [D loss:1.071242] [G loss:1.101048]  [D real:0.715404] [D fake:0.451872]
[Epoch 25/50] [Batch 599/938]  [D loss:1.214661] [G loss:1.769220]  [D real:0.826745] [D fake:0.569088]
[Epoch 25/50] [Batch 899/938]  [D loss:1.042482] [G loss:1.574865]  [D real:0.720972] [D fake:0.444199]
[Epoch 26/50] [Batch 299/938]  [D loss:1.014647] [G loss:0.812489]  [D real:0.555258] [D fake:0.246729]
[Epoch 26/50] [Batch 599/938]  [D loss:0.972982] [G loss:1.268242]  [D real:0.722147] [D fake:0.392772]
[Epoch 26/50] [Batch 899/938]  [D loss:1.158464] [G loss:2.182199]  [D real:0.791383] [D fake:0.547543]
[Epoch 27/50] [Batch 299/938]  [D loss:1.016049] [G loss:0.993888]  [D real:0.645598] [D fake:0.349482]
[Epoch 27/50] [Batch 599/938]  [D loss:1.011526] [G loss:0.771782]  [D real:0.618560] [D fake:0.322423]
[Epoch 27/50] [Batch 899/938]  [D loss:1.097934] [G loss:1.229312]  [D real:0.591573] [D fake:0.330537]
[Epoch 28/50] [Batch 299/938]  [D loss:1.043149] [G loss:0.680922]  [D real:0.537576] [D fake:0.223805]
[Epoch 28/50] [Batch 599/938]  [D loss:0.924291] [G loss:1.150502]  [D real:0.682267] [D fake:0.325791]
[Epoch 28/50] [Batch 899/938]  [D loss:0.774712] [G loss:1.353207]  [D real:0.662420] [D fake:0.237354]
[Epoch 29/50] [Batch 299/938]  [D loss:1.098983] [G loss:1.458917]  [D real:0.708487] [D fake:0.456147]
[Epoch 29/50] [Batch 599/938]  [D loss:0.901726] [G loss:1.458206]  [D real:0.704136] [D fake:0.354276]
[Epoch 29/50] [Batch 899/938]  [D loss:1.024077] [G loss:1.027868]  [D real:0.530083] [D fake:0.195724]
[Epoch 30/50] [Batch 299/938]  [D loss:1.006195] [G loss:1.339055]  [D real:0.681568] [D fake:0.381492]
[Epoch 30/50] [Batch 599/938]  [D loss:1.139939] [G loss:0.850789]  [D real:0.540653] [D fake:0.274699]
[Epoch 30/50] [Batch 899/938]  [D loss:1.045264] [G loss:1.354315]  [D real:0.661704] [D fake:0.374420]
[Epoch 31/50] [Batch 299/938]  [D loss:0.906118] [G loss:1.087022]  [D real:0.646974] [D fake:0.287992]
[Epoch 31/50] [Batch 599/938]  [D loss:0.923142] [G loss:1.168598]  [D real:0.661384] [D fake:0.317574]
[Epoch 31/50] [Batch 899/938]  [D loss:0.893291] [G loss:1.127621]  [D real:0.610552] [D fake:0.239781]
[Epoch 32/50] [Batch 299/938]  [D loss:1.028418] [G loss:0.872905]  [D real:0.511972] [D fake:0.163022]
[Epoch 32/50] [Batch 599/938]  [D loss:1.001148] [G loss:1.375986]  [D real:0.634169] [D fake:0.332520]
[Epoch 32/50] [Batch 899/938]  [D loss:0.897700] [G loss:1.646899]  [D real:0.638676] [D fake:0.226422]
[Epoch 33/50] [Batch 299/938]  [D loss:1.021669] [G loss:0.766808]  [D real:0.583394] [D fake:0.284108]
[Epoch 33/50] [Batch 599/938]  [D loss:1.095916] [G loss:1.762437]  [D real:0.771493] [D fake:0.501423]
[Epoch 33/50] [Batch 899/938]  [D loss:0.873408] [G loss:1.385971]  [D real:0.706132] [D fake:0.343957]
[Epoch 34/50] [Batch 299/938]  [D loss:0.974229] [G loss:1.208778]  [D real:0.628312] [D fake:0.261362]
[Epoch 34/50] [Batch 599/938]  [D loss:0.958586] [G loss:0.977570]  [D real:0.575545] [D fake:0.224128]
[Epoch 34/50] [Batch 899/938]  [D loss:0.962942] [G loss:1.669462]  [D real:0.711120] [D fake:0.392588]
[Epoch 35/50] [Batch 299/938]  [D loss:0.941913] [G loss:1.235123]  [D real:0.636958] [D fake:0.302639]
[Epoch 35/50] [Batch 599/938]  [D loss:0.866773] [G loss:1.674663]  [D real:0.809084] [D fake:0.426007]
[Epoch 35/50] [Batch 899/938]  [D loss:0.839387] [G loss:1.347061]  [D real:0.681547] [D fake:0.276491]
[Epoch 36/50] [Batch 299/938]  [D loss:0.908666] [G loss:1.740739]  [D real:0.802489] [D fake:0.433976]
[Epoch 36/50] [Batch 599/938]  [D loss:0.747275] [G loss:1.465722]  [D real:0.680790] [D fake:0.228464]
[Epoch 36/50] [Batch 899/938]  [D loss:0.853031] [G loss:1.050651]  [D real:0.617018] [D fake:0.190504]
[Epoch 37/50] [Batch 299/938]  [D loss:0.992326] [G loss:0.987399]  [D real:0.668963] [D fake:0.380671]
[Epoch 37/50] [Batch 599/938]  [D loss:0.999288] [G loss:1.124590]  [D real:0.722034] [D fake:0.401539]
[Epoch 37/50] [Batch 899/938]  [D loss:1.128850] [G loss:1.283640]  [D real:0.474914] [D fake:0.138298]
[Epoch 38/50] [Batch 299/938]  [D loss:1.140573] [G loss:0.910530]  [D real:0.543450] [D fake:0.280660]
[Epoch 38/50] [Batch 599/938]  [D loss:1.125377] [G loss:1.274833]  [D real:0.623111] [D fake:0.358482]
[Epoch 38/50] [Batch 899/938]  [D loss:0.903505] [G loss:2.463851]  [D real:0.804626] [D fake:0.452510]
[Epoch 39/50] [Batch 299/938]  [D loss:1.029963] [G loss:0.861951]  [D real:0.537239] [D fake:0.246637]
[Epoch 39/50] [Batch 599/938]  [D loss:0.928971] [G loss:1.389888]  [D real:0.726214] [D fake:0.353216]
[Epoch 39/50] [Batch 899/938]  [D loss:0.850291] [G loss:1.156543]  [D real:0.665365] [D fake:0.278161]
[Epoch 40/50] [Batch 299/938]  [D loss:1.114792] [G loss:1.124351]  [D real:0.620180] [D fake:0.358419]
[Epoch 40/50] [Batch 599/938]  [D loss:1.211019] [G loss:1.504199]  [D real:0.654950] [D fake:0.419927]
[Epoch 40/50] [Batch 899/938]  [D loss:1.067284] [G loss:1.602810]  [D real:0.738925] [D fake:0.461606]
[Epoch 41/50] [Batch 299/938]  [D loss:0.979463] [G loss:0.973400]  [D real:0.615557] [D fake:0.287014]
[Epoch 41/50] [Batch 599/938]  [D loss:1.006179] [G loss:1.956477]  [D real:0.717201] [D fake:0.396797]
[Epoch 41/50] [Batch 899/938]  [D loss:0.866434] [G loss:1.753222]  [D real:0.715549] [D fake:0.302359]
[Epoch 42/50] [Batch 299/938]  [D loss:0.859131] [G loss:2.199386]  [D real:0.731800] [D fake:0.323501]
[Epoch 42/50] [Batch 599/938]  [D loss:1.212487] [G loss:2.213250]  [D real:0.892617] [D fake:0.590188]
[Epoch 42/50] [Batch 899/938]  [D loss:0.993835] [G loss:1.371387]  [D real:0.711200] [D fake:0.400020]
[Epoch 43/50] [Batch 299/938]  [D loss:0.937333] [G loss:1.240518]  [D real:0.594048] [D fake:0.165137]
[Epoch 43/50] [Batch 599/938]  [D loss:0.826414] [G loss:1.805223]  [D real:0.716086] [D fake:0.306703]
[Epoch 43/50] [Batch 899/938]  [D loss:1.078007] [G loss:1.593838]  [D real:0.745717] [D fake:0.453779]
[Epoch 44/50] [Batch 299/938]  [D loss:0.884524] [G loss:1.749227]  [D real:0.724763] [D fake:0.355184]
[Epoch 44/50] [Batch 599/938]  [D loss:0.966496] [G loss:0.809007]  [D real:0.577357] [D fake:0.226516]
[Epoch 44/50] [Batch 899/938]  [D loss:0.865563] [G loss:1.897072]  [D real:0.715572] [D fake:0.327211]
[Epoch 45/50] [Batch 299/938]  [D loss:0.919493] [G loss:1.926025]  [D real:0.669408] [D fake:0.297098]
[Epoch 45/50] [Batch 599/938]  [D loss:1.062480] [G loss:1.232262]  [D real:0.549454] [D fake:0.219422]
[Epoch 45/50] [Batch 899/938]  [D loss:0.863310] [G loss:1.340070]  [D real:0.694014] [D fake:0.295423]
[Epoch 46/50] [Batch 299/938]  [D loss:0.974231] [G loss:1.514071]  [D real:0.675866] [D fake:0.307070]
[Epoch 46/50] [Batch 599/938]  [D loss:0.935487] [G loss:1.674119]  [D real:0.819082] [D fake:0.461227]
[Epoch 46/50] [Batch 899/938]  [D loss:0.883260] [G loss:2.014895]  [D real:0.805989] [D fake:0.412275]
[Epoch 47/50] [Batch 299/938]  [D loss:1.042589] [G loss:1.013629]  [D real:0.577147] [D fake:0.206049]
[Epoch 47/50] [Batch 599/938]  [D loss:0.942141] [G loss:1.822692]  [D real:0.712124] [D fake:0.330109]
[Epoch 47/50] [Batch 899/938]  [D loss:1.027585] [G loss:1.700922]  [D real:0.733564] [D fake:0.393722]
[Epoch 48/50] [Batch 299/938]  [D loss:1.036207] [G loss:2.060518]  [D real:0.794986] [D fake:0.473033]
[Epoch 48/50] [Batch 599/938]  [D loss:0.881303] [G loss:1.338223]  [D real:0.683420] [D fake:0.288061]
[Epoch 48/50] [Batch 899/938]  [D loss:0.844696] [G loss:1.736678]  [D real:0.706084] [D fake:0.303232]
[Epoch 49/50] [Batch 299/938]  [D loss:0.932430] [G loss:1.783170]  [D real:0.700632] [D fake:0.345498]
[Epoch 49/50] [Batch 599/938]  [D loss:0.746427] [G loss:1.045450]  [D real:0.687943] [D fake:0.225172]
[Epoch 49/50] [Batch 899/938]  [D loss:0.903058] [G loss:1.841586]  [D real:0.741125] [D fake:0.375467]

2. Save the models

# Save the trained models
torch.save(generator.state_dict(), './save/generator.pth')
torch.save(discriminator.state_dict(), './save/discriminator.pth')
