MNIST Handwritten Digit Recognition System Based on a PyTorch Deep Neural Network (Source Code, with GUI and Handwriting Canvas)

Step 1: Prepare the data

We use the open-source MNIST dataset; it can be downloaded automatically with torchvision, as shown in the sketch below.
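This is also what the training script in step 3 does. A minimal sketch, assuming torchvision is installed, that fetches the dataset into ./data and checks its size and sample format:

from torchvision import datasets
from torchvision.transforms import ToTensor

# download both splits into ./data (skipped automatically if already present)
train_data = datasets.MNIST(root='./data', train=True, download=True, transform=ToTensor())
test_data = datasets.MNIST(root='./data', train=False, download=True, transform=ToTensor())

print(len(train_data), len(test_data))    # 60000 10000
image, label = train_data[0]
print(image.shape, label)                 # torch.Size([1, 28, 28]) 5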

Step 2: Build the model

Here we build a LeNet-5 network.

The reference code is as follows:

import torch
from torch import nn


class Reshape(nn.Module):
    def forward(self, x):
        return x.view(-1, 1, 28, 28)


class LeNet5(nn.Module):
    def __init__(self):
        super(LeNet5, self).__init__()
        self.net = nn.Sequential(
            Reshape(),

            # CONV1, ReLU1, POOL1
            nn.Conv2d(in_channels=1, out_channels=6, kernel_size=5, padding=2),
            # nn.Conv2d(in_channels=1, out_channels=6, kernel_size=5),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2),

            # CONV2, ReLU2, POOL2
            nn.Conv2d(in_channels=6, out_channels=16, kernel_size=5),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2),
            nn.Flatten(),

            # FC1
            nn.Linear(in_features=16 * 5 * 5, out_features=120),
            nn.ReLU(),

            # FC2
            nn.Linear(in_features=120, out_features=84),
            nn.ReLU(),

            # FC3
            nn.Linear(in_features=84, out_features=10)
        )
        # softmax layer used to turn the logits into class probabilities
        # (nn.Softmax needs an explicit dim; dim=1 is the class dimension)
        self.softmax = nn.Softmax(dim=1)

    def forward(self, x):
        logits = self.net(x)
        # convert the logits to class probabilities
        prob = self.softmax(logits)
        return prob


if __name__ == '__main__':
    model = LeNet5()
    X = torch.rand(size=(256, 1, 28, 28), dtype=torch.float32)
    for layer in model.net:
        X = layer(X)
        print(layer.__class__.__name__, '\toutput shape: \t', X.shape)
    X = torch.rand(size=(1, 1, 28, 28), dtype=torch.float32)
    print(model(X))
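Running the shape check in the __main__ block above should print output along the following lines, which confirms the 16 * 5 * 5 = 400 input size of the first fully connected layer:

Reshape     output shape:   torch.Size([256, 1, 28, 28])
Conv2d      output shape:   torch.Size([256, 6, 28, 28])
ReLU        output shape:   torch.Size([256, 6, 28, 28])
MaxPool2d   output shape:   torch.Size([256, 6, 14, 14])
Conv2d      output shape:   torch.Size([256, 16, 10, 10])
ReLU        output shape:   torch.Size([256, 16, 10, 10])
MaxPool2d   output shape:   torch.Size([256, 16, 5, 5])
Flatten     output shape:   torch.Size([256, 400])
Linear      output shape:   torch.Size([256, 120])
ReLU        output shape:   torch.Size([256, 120])
Linear      output shape:   torch.Size([256, 84])
ReLU        output shape:   torch.Size([256, 84])
Linear      output shape:   torch.Size([256, 10])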

Step 3: Training code

import torch
from torch import nn
from torchvision import datasets
from torchvision.transforms import ToTensor
from torch.utils.data import DataLoader

from model import LeNet5


# DATASET
train_data = datasets.MNIST(
	root='./data',
	train=True,
	download=True,
	transform=ToTensor()
)

test_data = datasets.MNIST(
	root='./data',
	train=False,
	download=True,
	transform=ToTensor()
)


# PREPROCESS
batch_size = 256
train_dataloader = DataLoader(dataset=train_data, batch_size=batch_size)
test_dataloader = DataLoader(dataset=test_data, batch_size=batch_size)
for X, y in train_dataloader:
	print(X.shape)		# torch.Size([256, 1, 28, 28])
	print(y.shape)		# torch.Size([256])
	break


# MODEL
device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = LeNet5().to(device)


# TRAIN MODEL
loss_func = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(params=model.parameters())

def train(dataloader, model, loss_func, optimizer, epoch):
	model.train()
	data_size = len(dataloader.dataset)
	for batch, (X, y) in enumerate(dataloader):
		X, y = X.to(device), y.to(device)

		y_hat = model(X)
		loss = loss_func(y_hat, y)

		optimizer.zero_grad()
		loss.backward()
		optimizer.step()

	# report the loss of the last batch in this epoch
	print(f'EPOCH{epoch+1}\tloss: {loss.item():>7f}', end='\t')


# Test model
def test(dataloader, model, loss_fn):
	size = len(dataloader.dataset)
	num_batches = len(dataloader)
	model.eval()
	test_loss, correct = 0, 0
	with torch.no_grad():
		for X, y in dataloader:
			X, y = X.to(device), y.to(device)
			pred = model(X)
			test_loss += loss_fn(pred, y).item()
			correct += (pred.argmax(1) == y).type(torch.float).sum().item()
	test_loss /= num_batches
	correct /= size
	print(f'Test Error: Accuracy: {(100 * correct):>0.1f}%, Average loss: {test_loss:>8f}\n')


if __name__ == '__main__':
	epochs = 80
	for epoch in range(epochs):
		train(train_dataloader, model, loss_func, optimizer, epoch)
		test(test_dataloader, model, loss_func)

	# Save models
	torch.save(model.state_dict(), 'model.pth')
	print('Saved PyTorch LeNet5 State to model.pth')
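
Once training finishes, the saved model.pth can be reloaded for inference. A minimal sketch (not part of the packaged project) that reloads the weights and classifies a single test image:

import torch
from torchvision import datasets
from torchvision.transforms import ToTensor

from model import LeNet5

# rebuild the network and load the trained weights on CPU
model = LeNet5()
model.load_state_dict(torch.load('model.pth', map_location='cpu'))
model.eval()

test_data = datasets.MNIST(root='./data', train=False, download=True, transform=ToTensor())
image, label = test_data[0]                 # image: [1, 28, 28] tensor
with torch.no_grad():
    prob = model(image.unsqueeze(0))        # [1, 10] softmax probabilities
print('predicted:', int(prob.argmax(dim=1)), 'actual:', label)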

Step 4: Training statistics

In the log below, the loss plateaus around 1.46 rather than approaching zero because forward() applies Softmax before the outputs reach CrossEntropyLoss, which expects raw logits; the reported accuracy nevertheless keeps improving.

EPOCH1	loss: 1.908403	Test Error: Accuracy: 58.3%, Average loss: 1.943602

EPOCH2	loss: 1.776060	Test Error: Accuracy: 72.2%, Average loss: 1.750917

EPOCH3	loss: 1.717706	Test Error: Accuracy: 73.6%, Average loss: 1.730332

EPOCH4	loss: 1.719344	Test Error: Accuracy: 76.0%, Average loss: 1.703456

EPOCH5	loss: 1.659312	Test Error: Accuracy: 76.6%, Average loss: 1.694500

EPOCH6	loss: 1.647946	Test Error: Accuracy: 76.9%, Average loss: 1.691286

EPOCH7	loss: 1.653712	Test Error: Accuracy: 77.0%, Average loss: 1.690819

EPOCH8	loss: 1.653270	Test Error: Accuracy: 76.8%, Average loss: 1.692459

EPOCH9	loss: 1.649021	Test Error: Accuracy: 77.5%, Average loss: 1.686158

EPOCH10	loss: 1.648204	Test Error: Accuracy: 78.3%, Average loss: 1.678802

EPOCH11	loss: 1.647159	Test Error: Accuracy: 78.4%, Average loss: 1.676133

EPOCH12	loss: 1.647390	Test Error: Accuracy: 78.6%, Average loss: 1.674455

EPOCH13	loss: 1.646807	Test Error: Accuracy: 78.4%, Average loss: 1.675752

EPOCH14	loss: 1.630824	Test Error: Accuracy: 79.1%, Average loss: 1.668470

EPOCH15	loss: 1.524222	Test Error: Accuracy: 86.3%, Average loss: 1.599240

EPOCH16	loss: 1.524022	Test Error: Accuracy: 86.7%, Average loss: 1.594947

EPOCH17	loss: 1.524296	Test Error: Accuracy: 87.1%, Average loss: 1.588946

EPOCH18	loss: 1.523599	Test Error: Accuracy: 87.3%, Average loss: 1.588275

EPOCH19	loss: 1.523655	Test Error: Accuracy: 87.5%, Average loss: 1.586576

EPOCH20	loss: 1.523659	Test Error: Accuracy: 88.2%, Average loss: 1.579286

EPOCH21	loss: 1.523733	Test Error: Accuracy: 87.9%, Average loss: 1.582472

EPOCH22	loss: 1.523748	Test Error: Accuracy: 88.2%, Average loss: 1.578699

EPOCH23	loss: 1.523788	Test Error: Accuracy: 88.0%, Average loss: 1.579700

EPOCH24	loss: 1.523708	Test Error: Accuracy: 88.1%, Average loss: 1.579758

EPOCH25	loss: 1.523683	Test Error: Accuracy: 88.4%, Average loss: 1.575913

EPOCH26	loss: 1.523646	Test Error: Accuracy: 88.7%, Average loss: 1.572831

EPOCH27	loss: 1.523654	Test Error: Accuracy: 88.9%, Average loss: 1.570528

EPOCH28	loss: 1.523642	Test Error: Accuracy: 89.0%, Average loss: 1.570223

EPOCH29	loss: 1.523663	Test Error: Accuracy: 89.0%, Average loss: 1.570385

EPOCH30	loss: 1.523658	Test Error: Accuracy: 88.9%, Average loss: 1.571195

EPOCH31	loss: 1.523653	Test Error: Accuracy: 88.4%, Average loss: 1.575981

EPOCH32	loss: 1.523653	Test Error: Accuracy: 89.0%, Average loss: 1.570087

EPOCH33	loss: 1.523642	Test Error: Accuracy: 88.9%, Average loss: 1.571018

EPOCH34	loss: 1.523649	Test Error: Accuracy: 89.0%, Average loss: 1.570439

EPOCH35	loss: 1.523629	Test Error: Accuracy: 90.4%, Average loss: 1.555473

EPOCH36	loss: 1.461187	Test Error: Accuracy: 97.1%, Average loss: 1.491042

EPOCH37	loss: 1.461230	Test Error: Accuracy: 97.7%, Average loss: 1.485049

EPOCH38	loss: 1.461184	Test Error: Accuracy: 97.7%, Average loss: 1.485653

EPOCH39	loss: 1.461156	Test Error: Accuracy: 98.2%, Average loss: 1.479966

EPOCH40	loss: 1.461335	Test Error: Accuracy: 98.2%, Average loss: 1.479197

EPOCH41	loss: 1.461152	Test Error: Accuracy: 98.7%, Average loss: 1.475477

EPOCH42	loss: 1.461153	Test Error: Accuracy: 98.7%, Average loss: 1.475124

EPOCH43	loss: 1.461153	Test Error: Accuracy: 98.9%, Average loss: 1.472885

EPOCH44	loss: 1.461151	Test Error: Accuracy: 99.1%, Average loss: 1.470957

EPOCH45	loss: 1.461156	Test Error: Accuracy: 99.1%, Average loss: 1.471141

EPOCH46	loss: 1.461152	Test Error: Accuracy: 99.1%, Average loss: 1.470793

EPOCH47	loss: 1.461151	Test Error: Accuracy: 98.8%, Average loss: 1.474548

EPOCH48	loss: 1.461151	Test Error: Accuracy: 99.1%, Average loss: 1.470666

EPOCH49	loss: 1.461151	Test Error: Accuracy: 99.1%, Average loss: 1.471546

EPOCH50	loss: 1.461151	Test Error: Accuracy: 99.0%, Average loss: 1.471407

EPOCH51	loss: 1.461151	Test Error: Accuracy: 98.8%, Average loss: 1.473795

EPOCH52	loss: 1.461164	Test Error: Accuracy: 98.2%, Average loss: 1.480009

EPOCH53	loss: 1.461151	Test Error: Accuracy: 99.2%, Average loss: 1.469931

EPOCH54	loss: 1.461152	Test Error: Accuracy: 99.2%, Average loss: 1.469916

EPOCH55	loss: 1.461151	Test Error: Accuracy: 98.9%, Average loss: 1.472574

EPOCH56	loss: 1.461151	Test Error: Accuracy: 98.6%, Average loss: 1.476035

EPOCH57	loss: 1.461151	Test Error: Accuracy: 98.2%, Average loss: 1.478933

EPOCH58	loss: 1.461150	Test Error: Accuracy: 99.4%, Average loss: 1.468186

EPOCH59	loss: 1.461151	Test Error: Accuracy: 99.4%, Average loss: 1.467602

EPOCH60	loss: 1.461151	Test Error: Accuracy: 99.1%, Average loss: 1.471206

EPOCH61	loss: 1.461151	Test Error: Accuracy: 98.8%, Average loss: 1.473356

EPOCH62	loss: 1.461151	Test Error: Accuracy: 99.2%, Average loss: 1.470242

EPOCH63	loss: 1.461150	Test Error: Accuracy: 99.1%, Average loss: 1.470826

EPOCH64	loss: 1.461151	Test Error: Accuracy: 98.7%, Average loss: 1.474476

EPOCH65	loss: 1.461150	Test Error: Accuracy: 99.3%, Average loss: 1.469116

EPOCH66	loss: 1.461150	Test Error: Accuracy: 99.4%, Average loss: 1.467823

EPOCH67	loss: 1.461150	Test Error: Accuracy: 99.5%, Average loss: 1.466486

EPOCH68	loss: 1.461152	Test Error: Accuracy: 99.3%, Average loss: 1.468688

EPOCH69	loss: 1.461150	Test Error: Accuracy: 99.5%, Average loss: 1.466256

EPOCH70	loss: 1.461150	Test Error: Accuracy: 99.5%, Average loss: 1.466588

EPOCH71	loss: 1.461150	Test Error: Accuracy: 99.6%, Average loss: 1.465280

EPOCH72	loss: 1.461150	Test Error: Accuracy: 99.4%, Average loss: 1.467110

EPOCH73	loss: 1.461151	Test Error: Accuracy: 99.6%, Average loss: 1.465245

EPOCH74	loss: 1.461150	Test Error: Accuracy: 99.5%, Average loss: 1.466551

EPOCH75	loss: 1.461150	Test Error: Accuracy: 99.5%, Average loss: 1.466001

EPOCH76	loss: 1.461150	Test Error: Accuracy: 99.3%, Average loss: 1.468074

EPOCH77	loss: 1.461151	Test Error: Accuracy: 99.6%, Average loss: 1.465709

EPOCH78	loss: 1.461150	Test Error: Accuracy: 99.5%, Average loss: 1.466567

EPOCH79	loss: 1.461150	Test Error: Accuracy: 99.6%, Average loss: 1.464922

EPOCH80	loss: 1.461150	Test Error: Accuracy: 99.6%, Average loss: 1.465109

Step 5: Build the GUI (handwriting canvas)
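
The packaged project ships its own GUI; as a reference only, here is a minimal sketch of a Tkinter handwriting board that feeds the drawn digit to the trained LeNet-5. It assumes Pillow is installed and that model.py and model.pth from the previous steps are in the same directory.

import tkinter as tk

import torch
from PIL import Image, ImageDraw, ImageOps

from model import LeNet5

device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = LeNet5().to(device)
model.load_state_dict(torch.load('model.pth', map_location=device))
model.eval()

CANVAS_SIZE = 280  # 10x the 28x28 MNIST resolution, easier to draw on


class PaintBoard:
    def __init__(self, root):
        self.canvas = tk.Canvas(root, width=CANVAS_SIZE, height=CANVAS_SIZE, bg='white')
        self.canvas.pack()
        # mirror the strokes into a PIL image so they can be fed to the model
        self.image = Image.new('L', (CANVAS_SIZE, CANVAS_SIZE), color=255)
        self.draw = ImageDraw.Draw(self.image)
        self.canvas.bind('<B1-Motion>', self.paint)
        tk.Button(root, text='Recognize', command=self.predict).pack(side=tk.LEFT)
        tk.Button(root, text='Clear', command=self.clear).pack(side=tk.LEFT)
        self.result = tk.Label(root, text='Draw a digit')
        self.result.pack(side=tk.LEFT)

    def paint(self, event):
        r = 10  # brush radius in pixels
        box = (event.x - r, event.y - r, event.x + r, event.y + r)
        self.canvas.create_oval(*box, fill='black', outline='black')
        self.draw.ellipse(box, fill=0)

    def clear(self):
        self.canvas.delete('all')
        self.draw.rectangle((0, 0, CANVAS_SIZE, CANVAS_SIZE), fill=255)
        self.result.config(text='Draw a digit')

    def predict(self):
        # downscale to 28x28 and invert so the digit is white on black, like MNIST
        img = ImageOps.invert(self.image.resize((28, 28)))
        x = torch.tensor(list(img.getdata()), dtype=torch.float32).view(1, 1, 28, 28) / 255.0
        with torch.no_grad():
            prob = model(x.to(device))
        self.result.config(text=f'Prediction: {int(prob.argmax(dim=1))}')


if __name__ == '__main__':
    root = tk.Tk()
    root.title('MNIST Handwritten Digit Recognition')
    PaintBoard(root)
    root.mainloop()

The canvas strokes are mirrored into a Pillow image, downscaled to 28x28, and inverted so the digit appears white on black, matching the MNIST format the network was trained on.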

Step 6: Contents of the complete project

The project includes the training code, the trained model, the training log, the dataset, and the GUI code. For usage instructions, see the included document "文档说明_必看.docx".

Download link (opens in a new window): MNIST Handwritten Digit Recognition System Based on a PyTorch Deep Neural Network (Source Code, with GUI and Handwriting Canvas)

If you have any questions, feel free to leave a comment or send a private message; every question will be answered.

