<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0">
  <channel>
    <title>Deep Learning - Category - mywebsite</title>
    <link>https://steven-yl.github.io/mywebsite/categories/deep-learning/</link>
    <description>Deep Learning - Category - mywebsite</description>
    <generator>Hugo -- gohugo.io</generator>
    <language>zh-CN</language>
    <managingEditor>steven@gmail.com (Steven)</managingEditor>
    <webMaster>steven@gmail.com (Steven)</webMaster>
    <copyright>This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.</copyright>
    <lastBuildDate>Wed, 01 Apr 2026 10:00:00 +0800</lastBuildDate>
    <atom:link href="https://steven-yl.github.io/mywebsite/categories/deep-learning/" rel="self" type="application/rss+xml"/>
    <item>
      <title>Common Normalization Methods in Deep Learning</title>
      <link>https://steven-yl.github.io/mywebsite/norm/</link>
      <pubDate>Wed, 01 Apr 2026 10:00:00 +0800</pubDate>
      <author>steven@gmail.com (Steven)</author>
      <guid>https://steven-yl.github.io/mywebsite/norm/</guid>
      <description>Common normalization methods in deep learning</description>
    </item>
    <item>
      <title>KL Divergence and the Generalized KL Loss in Discrete Flow Matching</title>
      <link>https://steven-yl.github.io/mywebsite/kl_div/</link>
      <pubDate>Wed, 25 Mar 2026 00:00:00 +0800</pubDate>
      <author>steven@gmail.com (Steven)</author>
      <guid>https://steven-yl.github.io/mywebsite/kl_div/</guid>
      <description>This post ties together several core concepts related to KL divergence and gives an intuitive explanation of the generalized KL loss in discrete flow matching, along with a PyTorch implementation example.</description>
    </item>
    <item>
      <title>Loss Functions: A Systematic Overview</title>
      <link>https://steven-yl.github.io/mywebsite/loss_type/</link>
      <pubDate>Wed, 25 Mar 2026 00:00:00 +0800</pubDate>
      <author>steven@gmail.com (Steven)</author>
      <guid>https://steven-yl.github.io/mywebsite/loss_type/</guid>
      <description>These notes cover mainstream loss functions from a task-oriented perspective, including classic methods, modern variants, and practical combination strategies, for quick comparison and selection.</description>
    </item>
  </channel>
</rss>