## Pretraining
- GitHub - huggingface/transformers: 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
- GitHub - microsoft/DeepSpeed: DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
- GitHub - huggingface/peft: 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
- GitHub - huggingface/accelerate: 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (including fp8), and easy-to-configure FSDP and DeepSpeed support
|
```shell
# torch — reference Dockerfiles for building a GPU training environment
https://github.com/microsoft/DeepSpeed/blob/master/docker/Dockerfile
https://github.com/huggingface/transformers/blob/main/docker/transformers-all-latest-gpu/Dockerfile
https://github.com/huggingface/peft/tree/main/docker/peft-gpu-bnb-source
https://github.com/huggingface/accelerate/blob/main/docker/accelerate-gpu/Dockerfile
```
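Drawing on the reference Dockerfiles above, a minimal sketch of a pretraining environment image might look like the following. This is not any project's official image: the `pytorch/pytorch` base tag is an assumption (pick one matching your CUDA driver), and the pip packages are installed unpinned for brevity.

```dockerfile
# Minimal sketch, not an official image; the base tag is an assumption.
FROM pytorch/pytorch:2.1.0-cuda12.1-cudnn8-devel

# DeepSpeed JIT-compiles its CUDA ops at runtime, so it needs a compiler toolchain.
RUN apt-get update \
    && apt-get install -y --no-install-recommends git build-essential \
    && rm -rf /var/lib/apt/lists/*

# The training stack referenced above (PyTorch ships with the base image).
RUN pip install --no-cache-dir transformers accelerate peft deepspeed

WORKDIR /workspace
```

Building and running would then follow the usual flow, e.g. `docker build -t pretrain-env .` and `docker run --gpus all -it pretrain-env`.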