Update 4.0 基于Hugging Face -Transformers的预训练模型微调.md
This commit is contained in: parent 5a81fb57f9 · commit 902cea800d
@ -125,8 +125,8 @@ Encoder-decoder | BART, T5, Marian, mBART | summarization, translation, generative question answering
>Or click [Open in Colab](https://colab.research.google.com/github/huggingface/notebooks/blob/master/course/chapter1/section3.ipynb) to open a Google Colab notebook containing code examples for other pipeline applications.
If you want to run the examples locally, we recommend taking a look at [setup](https://huggingface.co/course/chapter0).
## 3. Behind the pipeline
>Code for this section: [Open in Colab](https://colab.research.google.com/github/huggingface/notebooks/blob/master/course/chapter2/section2_pt.ipynb) (PyTorch)<br>
>YouTube video: [what happened inside the pipeline function](https://youtu.be/1pedAIvTWXk)
Let's start with a complete example and look at what happens behind the scenes when we execute the following code from Section 1:
```python
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
classifier(
    [
        "I've been waiting for a HuggingFace course my whole life.",
        "I hate this so much!",
    ]
)
```
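Behind the scenes, the pipeline chains three stages: tokenizer preprocessing, the model forward pass, and post-processing of the raw logits into labeled scores. The post-processing stage is essentially a softmax over the output logits; here is a minimal sketch in plain Python (the logit values below are hypothetical, for illustration only, not taken from a real model run):

```python
import math


def softmax(logits):
    # Shift by the max logit for numerical stability, then normalize.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


# Hypothetical [NEGATIVE, POSITIVE] logits from a sentiment model head.
logits = [-1.5607, 1.6123]
probs = softmax(logits)
print(probs)  # two probabilities summing to 1, with POSITIVE dominating
```

The pipeline then maps each probability to its label via the model config's id-to-label mapping and reports the highest-scoring class.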