<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title>TorchCode - Tag - mywebsite</title><link>https://steven-yl.github.io/mywebsite/tags/torchcode/</link><description>TorchCode - Tag - mywebsite</description><generator>Hugo -- gohugo.io</generator><language>zh-CN</language><managingEditor>steven@gmail.com (Steven)</managingEditor><webMaster>steven@gmail.com (Steven)</webMaster><copyright>This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.</copyright><lastBuildDate>Wed, 01 Apr 2026 10:00:00 +0800</lastBuildDate><atom:link href="https://steven-yl.github.io/mywebsite/tags/torchcode/" rel="self" type="application/rss+xml"/>
<item><title>Overview: TorchCode Knowledge Architecture and Learning Path</title><link>https://steven-yl.github.io/mywebsite/00_overview/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/00_overview/</guid><description>Overall architecture and recommended reading order for the operator-level practice project.</description></item>
<item><title>Chapter 1: Activation Functions and Fundamental Components (TorchCode)</title><link>https://steven-yl.github.io/mywebsite/01_activations_and_fundamentals/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/01_activations_and_fundamentals/</guid><description>TorchCode documentation, Chapter 1: from the lowest-level operators to loss functions.</description></item>
<item><title>Chapter 2: Normalization Techniques (TorchCode)</title><link>https://steven-yl.github.io/mywebsite/02_normalization/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/02_normalization/</guid><description>TorchCode documentation, Chapter 2: a complete guide to normalization techniques.</description></item>
<item><title>Chapter 3: Attention Mechanisms (TorchCode)</title><link>https://steven-yl.github.io/mywebsite/03_attention_mechanisms/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/03_attention_mechanisms/</guid><description>TorchCode documentation, Chapter 3: attention mechanisms from basics to efficient implementations.</description></item>
<item><title>Chapter 4: Architectures and Model Components (TorchCode)</title><link>https://steven-yl.github.io/mywebsite/04_architectures/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/04_architectures/</guid><description>TorchCode documentation, Chapter 4: from components to complete model blocks.</description></item>
<item><title>Chapter 5: Training and Optimization (TorchCode)</title><link>https://steven-yl.github.io/mywebsite/05_training_optimization/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/05_training_optimization/</guid><description>TorchCode documentation, Chapter 5: training and optimization in practice.</description></item>
<item><title>Chapter 6: Inference and Decoding Strategies (TorchCode)</title><link>https://steven-yl.github.io/mywebsite/06_inference_decoding/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/06_inference_decoding/</guid><description>TorchCode documentation, Chapter 6: inference and decoding.</description></item>
<item><title>Chapter 7: Advanced Topics (TorchCode)</title><link>https://steven-yl.github.io/mywebsite/07_advanced_topics/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/07_advanced_topics/</guid><description>TorchCode documentation, Chapter 7: tokenization, quantization, and RLHF losses.</description></item>
<item><title>TorchCode Technical Documentation Index</title><link>https://steven-yl.github.io/mywebsite/readme/</link><pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/readme/</guid><description>Companion documentation index for the from-scratch operator implementation practice project, linking to the overview and per-chapter deep dives.</description></item>
</channel></rss>