<rss xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title>Activation Function - Tag - mywebsite</title><link>https://steven-yl.github.io/mywebsite/tags/activation-function/</link><description>Activation Function - Tag - mywebsite</description><generator>Hugo -- gohugo.io</generator><language>zh-CN</language><managingEditor>steven@gmail.com (Steven)</managingEditor><webMaster>steven@gmail.com (Steven)</webMaster><copyright>This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.</copyright><lastBuildDate>Tue, 24 Mar 2026 00:00:00 +0800</lastBuildDate><atom:link href="https://steven-yl.github.io/mywebsite/tags/activation-function/" rel="self" type="application/rss+xml"/><item><title>PyTorch Activation Functions</title><link>https://steven-yl.github.io/mywebsite/active_function/</link><pubDate>Tue, 24 Mar 2026 00:00:00 +0800</pubDate><author><name>Steven</name><uri>https://github.com/steven-yl</uri></author><guid>https://steven-yl.github.io/mywebsite/active_function/</guid><description>This article surveys activation functions including Sigmoid, Tanh, ReLU, GELU, and Swish, with grouped plots and an overview plot.</description></item></channel></rss>