<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title>PyTorch - Category - mywebsite</title><link>https://steven-yl.github.io/mywebsite/categories/pytorch/</link><description>PyTorch - Category - mywebsite</description><generator>Hugo -- gohugo.io</generator><language>en</language><managingEditor>steven@gmail.com (Steven)</managingEditor><webMaster>steven@gmail.com (Steven)</webMaster><copyright>This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.</copyright><lastBuildDate>Fri, 03 Apr 2026 00:00:00 +0800</lastBuildDate><atom:link href="https://steven-yl.github.io/mywebsite/categories/pytorch/" rel="self" type="application/rss+xml"/>
<item><title>Overview: TorchCode Knowledge Architecture and Learning Path</title><link>https://steven-yl.github.io/mywebsite/00_overview/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/00_overview/</guid><description>The overall architecture and recommended reading order for the operator-level practice project.</description></item>
<item><title>PyTorch Learning-Rate Curves</title><link>https://steven-yl.github.io/mywebsite/lr_function/</link><pubDate>Tue, 24 Mar 2026 00:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/lr_function/</guid><description>Learning-rate curve plots.</description></item>
<item><title>PyTorch Activation Functions</title><link>https://steven-yl.github.io/mywebsite/active_function/</link><pubDate>Tue, 24 Mar 2026 00:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/active_function/</guid><description>A summary of activation functions including Sigmoid, Tanh, ReLU, GELU, and Swish, with grouped plots and an overview chart.</description></item>
<item><title>PyTorch Distributed Training and Tooling Technical Guide</title><link>https://steven-yl.github.io/mywebsite/distributed_training_guide/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/distributed_training_guide/</guid><description>A comprehensive guide to using PyTorch distributed training correctly in single-node multi-GPU and multi-node multi-GPU settings, covering process-group initialization, DDP wrapping, data sharding, collective communication, and Lightning integration.</description></item>
<item><title>Chapter 1: Activation Functions and Fundamental Components (TorchCode)</title><link>https://steven-yl.github.io/mywebsite/01_activations_and_fundamentals/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/01_activations_and_fundamentals/</guid><description>TorchCode documentation, Chapter 1: from the lowest-level operators to loss functions.</description></item>
<item><title>PyTorch Dataset System Technical Guide</title><link>https://steven-yl.github.io/mywebsite/pytorch_dataset_guide/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/pytorch_dataset_guide/</guid><description>Covers map-style and IterableDataset, all built-in Dataset extensions, graph data and HF datasets, typical project extension patterns, the division of responsibilities between padding and collate, and the interface with DataLoader.</description></item>
<item><title>PyTorch Weight Initialization Methods</title><link>https://steven-yl.github.io/mywebsite/net_init/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/net_init/</guid><description>A comprehensive comparison of deep-learning weight initialization methods: principles, formula derivations, pros and cons, and applicable scenarios, with PyTorch code examples and initialization best practices for Transformer architectures.</description></item>
<item><title>Chapter 2: Normalization Techniques (TorchCode)</title><link>https://steven-yl.github.io/mywebsite/02_normalization/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/02_normalization/</guid><description>TorchCode documentation, Chapter 2: a complete guide to normalization techniques.</description></item>
<item><title>PyTorch DataLoader Technical Deep Dive</title><link>https://steven-yl.github.io/mywebsite/dataloader_guide/</link><pubDate>Thu, 12 Mar 2026 00:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/dataloader_guide/</guid><description>Explains DataLoader responsibilities along three threads (index flow, sample fetching, and batch assembly), covering Sampler, collate_fn, num_workers, pin_memory, and the interface with Dataset.</description></item>
<item><title>Chapter 3: Attention Mechanisms (TorchCode)</title><link>https://steven-yl.github.io/mywebsite/03_attention_mechanisms/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/03_attention_mechanisms/</guid><description>TorchCode documentation, Chapter 3: attention mechanisms from fundamentals to efficient implementations.</description></item></channel></rss>