Getting Started with PyTorch: Building a Neural Network Step by Step

Install PyTorch and Set Up the Environment

Make sure Python is installed (version 3.7 or later is recommended). Get the install command from the official PyTorch website; for example, to install the CPU build with pip:

pip install torch torchvision

Verify that the installation succeeded:

import torch
print(torch.__version__)
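
If you plan to train on a GPU later, you can also check whether CUDA is visible to PyTorch (this simply prints False on a CPU-only install):

import torch

# True only if a CUDA-capable GPU and a matching PyTorch build are present
print(torch.cuda.is_available())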

Understanding the Basic Components of a Neural Network

The core of PyTorch is the torch.nn module, which contains layers (such as Linear and Conv2d), activation functions (such as ReLU), and loss functions (such as MSELoss). A neural network is implemented by subclassing nn.Module and overriding the __init__ and forward methods.
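
As a quick illustration, a layer such as nn.Linear is itself a small module that can be called on a tensor; the shapes below are arbitrary:

import torch
import torch.nn as nn

layer = nn.Linear(784, 128)   # fully connected layer: 784 inputs, 128 outputs
x = torch.randn(32, 784)      # a batch of 32 random input vectors
out = torch.relu(layer(x))    # apply the layer, then a ReLU activation
print(out.shape)              # torch.Size([32, 128])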

Building a Simple Fully Connected Network

Below is a two-layer network for MNIST classification:

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(784, 128)  # input layer to hidden layer
        self.fc2 = nn.Linear(128, 10)   # hidden layer to output layer

    def forward(self, x):
        x = x.view(-1, 784)            # flatten the input images
        x = F.relu(self.fc1(x))        # ReLU activation
        x = self.fc2(x)
        return F.log_softmax(x, dim=1) # log-probabilities for multi-class output
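
A quick sanity check of the network with a random batch shaped like MNIST images (the batch size of 4 is arbitrary):

import torch

model = Net()
dummy = torch.randn(4, 1, 28, 28)   # fake batch of 4 grayscale 28x28 images
out = model(dummy)
print(out.shape)                    # torch.Size([4, 10]): one log-probability per class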

Loading and Preprocessing the Data

Use torchvision to load the MNIST dataset:

import torch
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.5,), (0.5,))
])

train_set = datasets.MNIST('data', train=True, download=True, transform=transform)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)
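
To confirm the loader works as expected, you can pull a single batch and inspect its shape (a minimal sketch):

images, labels = next(iter(train_loader))
print(images.shape)   # torch.Size([64, 1, 28, 28])
print(labels.shape)   # torch.Size([64])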

Key Steps for Training the Model

Initialize the model, optimizer, and loss function:

model = Net()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.NLLLoss()

An example training loop:

for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()               # clear gradients from the previous step
        output = model(images)              # forward pass
        loss = criterion(output, labels)    # negative log-likelihood loss
        loss.backward()                     # backpropagate
        optimizer.step()                    # update the parameters
    print(f'Epoch {epoch+1}, Loss: {loss.item():.4f}')  # loss of the last batch in the epoch
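
The print above only reports the loss of the final batch in each epoch. A minimal sketch of reporting the average loss per epoch instead (the running_loss variable is illustrative):

for epoch in range(5):
    running_loss = 0.0
    for images, labels in train_loader:
        optimizer.zero_grad()
        output = model(images)
        loss = criterion(output, labels)
        loss.backward()
        optimizer.step()
        running_loss += loss.item()
    print(f'Epoch {epoch+1}, Avg Loss: {running_loss / len(train_loader):.4f}')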

Evaluating the Model and Making Predictions

Evaluate accuracy on the test set:

test_set = datasets.MNIST('data', train=False, transform=transform)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=64)

model.eval()                          # switch to evaluation mode
correct = 0
total = 0
with torch.no_grad():                 # no gradient tracking needed for inference
    for images, labels in test_loader:
        outputs = model(images)
        _, predicted = torch.max(outputs.data, 1)  # index of the highest log-probability
        total += labels.size(0)
        correct += (predicted == labels).sum().item()

print(f'Accuracy: {100 * correct / total:.2f}%')
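
To inspect an individual prediction, a minimal sketch that classifies a single test image (index 0 is arbitrary):

image, label = test_set[0]              # one (1, 28, 28) tensor and its integer label
with torch.no_grad():
    output = model(image.unsqueeze(0))  # add a batch dimension before the forward pass
    prediction = output.argmax(dim=1).item()
print(f'Predicted: {prediction}, Actual: {label}')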

Suggestions for Extending and Optimizing the Model

Try adjusting the network structure (e.g., adding hidden layers), switching to a different optimizer (such as Adam), adding regularization (such as Dropout), or using a learning-rate scheduler; a sketch combining two of these ideas follows the summary example below. The third-party torchsummary package (installed separately via pip) can help visualize the network structure:

from torchsummary import summary
summary(model, (1, 28, 28))
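
As one possible variation, here is a minimal sketch of the same network with a Dropout layer and the Adam optimizer; the dropout rate and learning rate are arbitrary choices, and NetWithDropout is a hypothetical name:

import torch
import torch.nn as nn
import torch.nn.functional as F

class NetWithDropout(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.dropout = nn.Dropout(p=0.2)   # zeroes 20% of activations during training
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = x.view(-1, 784)
        x = F.relu(self.fc1(x))
        x = self.dropout(x)
        return F.log_softmax(self.fc2(x), dim=1)

model = NetWithDropout()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)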
