
# Pre-training

- GitHub - huggingface/transformers: 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
- GitHub - microsoft/DeepSpeed: DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
- GitHub - huggingface/peft: 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
- GitHub - huggingface/accelerate: 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, with automatic mixed precision (including fp8) and easy-to-configure FSDP and DeepSpeed support.

# torch

Upstream reference Dockerfiles:

- https://github.com/microsoft/DeepSpeed/blob/master/docker/Dockerfile
- https://github.com/huggingface/transformers/blob/main/docker/transformers-all-latest-gpu/Dockerfile
- https://github.com/huggingface/peft/tree/main/docker/peft-gpu-bnb-source
- https://github.com/huggingface/accelerate/blob/main/docker/accelerate-gpu/Dockerfile
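
The upstream Dockerfiles linked above can be distilled into a minimal sketch of a combined pre-training image. The CUDA base tag, the `cu121` wheel index, and the unpinned package versions below are illustrative assumptions, not values taken from any of the linked files:

```dockerfile
# Minimal pre-training image sketch (assumed CUDA 12.1 base; versions unpinned).
FROM nvidia/cuda:12.1.1-cudnn8-devel-ubuntu22.04

# System dependencies for building and fetching the Python stack.
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip git && \
    rm -rf /var/lib/apt/lists/*

# Install PyTorch first (CUDA wheels), then the Hugging Face / DeepSpeed stack on top.
RUN python3 -m pip install --no-cache-dir \
        torch --index-url https://download.pytorch.org/whl/cu121 && \
    python3 -m pip install --no-cache-dir \
        transformers accelerate peft deepspeed

# Quick sanity check that the container sees a GPU.
CMD ["python3", "-c", "import torch; print(torch.cuda.is_available())"]
```

In practice you would pin the `torch`, `transformers`, `peft`, `accelerate`, and `deepspeed` versions to a combination you have tested together, as the upstream images do.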