From 902cea800ded3976b5b5a8b1e72ac97f156ef6cc Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E8=80=81=E5=BC=A0?=
Date: Sun, 12 Sep 2021 21:47:40 +0800
Subject: [PATCH] =?UTF-8?q?Update=204.0=20=E5=9F=BA=E4=BA=8EHugging=20Face?=
 =?UTF-8?q?=20-Transformers=E7=9A=84=E9=A2=84=E8=AE=AD=E7=BB=83=E6=A8=A1?=
 =?UTF-8?q?=E5=9E=8B=E5=BE=AE=E8=B0=83.md?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 .../4.0 基于Hugging Face -Transformers的预训练模型微调.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/篇章4-使用Transformers解决NLP任务/4.0 基于Hugging Face -Transformers的预训练模型微调.md b/docs/篇章4-使用Transformers解决NLP任务/4.0 基于Hugging Face -Transformers的预训练模型微调.md
index abb0e64..be07991 100644
--- a/docs/篇章4-使用Transformers解决NLP任务/4.0 基于Hugging Face -Transformers的预训练模型微调.md
+++ b/docs/篇章4-使用Transformers解决NLP任务/4.0 基于Hugging Face -Transformers的预训练模型微调.md
@@ -125,8 +125,8 @@ Encoder-decoder | BART, T5, Marian, mBART |摘要生成、翻译、生成式问答
 >或单击[Open in Colab](https://colab.research.google.com/github/huggingface/notebooks/blob/master/course/chapter1/section3.ipynb)以打开包含其它管道应用代码示例的 Google Colab 笔记本。 如果您想在本地运行示例，我们建议您查看[设置](https://huggingface.co/course/chapter0)。
 
 ## 3. Behind the pipeline
->本节代码[Open in Colab](https://colab.research.google.com/github/huggingface/notebooks/blob/master/course/chapter2/section2_pt.ipynb) (PyTorch)
-[YouTube视频:what happend inside the pipeline function](https://youtu.be/1pedAIvTWXk)
+>本节代码：[Open in Colab](https://colab.research.google.com/github/huggingface/notebooks/blob/master/course/chapter2/section2_pt.ipynb) (PyTorch)
+>YouTube视频:[what happend inside the pipeline function](https://youtu.be/1pedAIvTWXk)
 
 让我们从一个完整的例子开始，看看当我们在第1节中执行以下代码时，幕后发生了什么：
 ```python
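The hunk ends at the opening of a Python code block whose body lies outside the diff context. For reference, the code that "当我们在第1节中执行以下代码时" points back to is most likely the standard sentiment-analysis `pipeline` example from the Hugging Face course; the sketch below is a hedged reconstruction, and its example sentences are illustrative rather than copied from the patched file.

```python
from transformers import pipeline

# Build a default sentiment-analysis pipeline; the underlying model is
# downloaded automatically on first use.
classifier = pipeline("sentiment-analysis")

# Run inference on a couple of example sentences (illustrative inputs).
results = classifier([
    "I've been waiting for a HuggingFace course my whole life.",
    "I hate this so much!",
])

# Each result is a dict with a predicted label and a confidence score,
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}, {'label': 'NEGATIVE', 'score': 0.99...}]
print(results)
```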