<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>GPT on Jeanphilo Blog</title>
    <link>https://shio-chan-dev.github.io/jeanblog/zh/tags/gpt/</link>
    <description>Recent content in GPT on Jeanphilo Blog</description>
    <generator>Hugo -- 0.159.2</generator>
    <language>zh-cn</language>
    <lastBuildDate>Sat, 24 Jan 2026 16:15:34 +0800</lastBuildDate>
    <atom:link href="https://shio-chan-dev.github.io/jeanblog/zh/tags/gpt/index.xml" rel="self" type="application/rss+xml"/>
    <item>
      <title>Why GPT Is Decoder-Only: The Best Form for Autoregressive Generation</title>
      <link>https://shio-chan-dev.github.io/jeanblog/zh/ai/llm/why-gpt-decoder-only/</link>
      <pubDate>Sat, 24 Jan 2026 16:15:34 +0800</pubDate>
      <guid>https://shio-chan-dev.github.io/jeanblog/zh/ai/llm/why-gpt-decoder-only/</guid>
      <description>Explains why GPT adopts a decoder-only architecture, with an engineering comparison against encoder-only / encoder-decoder designs.</description>
    </item>
    <item>
      <title>BERT vs GPT: Pretraining Tasks and Application Differences</title>
      <link>https://shio-chan-dev.github.io/jeanblog/zh/ai/llm/bert-vs-gpt-pretraining-objectives/</link>
      <pubDate>Sat, 24 Jan 2026 16:12:12 +0800</pubDate>
      <guid>https://shio-chan-dev.github.io/jeanblog/zh/ai/llm/bert-vs-gpt-pretraining-objectives/</guid>
      <description>Compares the pretraining objectives, architectural assumptions, and engineering scenarios of BERT and GPT, and gives a minimal runnable example.</description>
    </item>
  </channel>
</rss>