Installation Guide
This guide covers all the ways you can install and set up Axolotl for your environment.
1 Requirements
- NVIDIA GPU (Ampere architecture or newer for bf16 and Flash Attention) or AMD GPU
- Python ≥3.10
- PyTorch ≥2.4.1
2 Installation Methods
2.1 PyPI Installation (Recommended)
pip3 install --no-build-isolation axolotl[flash-attn,deepspeed]
We use --no-build-isolation so that the build can see an already-installed PyTorch rather than clobbering it with a fresh copy, and so that dependencies whose versions depend on the installed PyTorch build (or other installed co-dependencies) are pinned correctly.
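The detection that --no-build-isolation makes possible can be sketched as follows; the helper name here is illustrative, not Axolotl's actual build code.

```python
# Sketch: with build isolation disabled, the build environment can query
# the already-installed PyTorch and pin matching dependency versions,
# instead of resolving against a throwaway isolated environment.
from importlib.metadata import version, PackageNotFoundError


def existing_torch() -> "str | None":
    """Return the installed PyTorch version string, or None if absent."""
    try:
        return version("torch")
    except PackageNotFoundError:
        return None


print(existing_torch() or "no PyTorch found; it would be installed fresh")
```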
2.2 Edge/Development Build
For the latest features between releases:
git clone https://github.com/axolotl-ai-cloud/axolotl.git
cd axolotl
pip3 install packaging ninja
pip3 install --no-build-isolation -e '.[flash-attn,deepspeed]'
2.3 Docker
docker run --gpus '"all"' --rm -it axolotlai/axolotl:main-latest
For development with Docker:
docker compose up -d
docker run --privileged --gpus '"all"' --shm-size 10g --rm -it \
--name axolotl --ipc=host \
--ulimit memlock=-1 --ulimit stack=67108864 \
--mount type=bind,src="${PWD}",target=/workspace/axolotl \
-v ${HOME}/.cache/huggingface:/root/.cache/huggingface \
axolotlai/axolotl:main-latest
3 Cloud Environments
3.1 Cloud GPU Providers
For providers supporting Docker:
- Use the axolotlai/axolotl-cloud:main-latest image
3.2 Google Colab
Use our example notebook.
4 Platform-Specific Instructions
4.1 macOS
pip3 install --no-build-isolation -e '.'
See Section 6 for Mac-specific issues.
4.2 Windows
We recommend using WSL2 (Windows Subsystem for Linux) or Docker.
5 Environment Managers
5.1 Conda/Pip venv
Install Python ≥3.10
Install PyTorch: https://pytorch.org/get-started/locally/
Install Axolotl:
pip3 install packaging
pip3 install --no-build-isolation -e '.[flash-attn,deepspeed]'
(Optional) Log in to Hugging Face:
huggingface-cli login
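After the steps above, a short import check confirms the pieces landed in the active environment. This is a hedged sketch, not an official verification command:

```python
# Post-install sanity check: confirm the key packages are importable
# from the current Python environment.
import importlib.util


def installed(name: str) -> bool:
    """Return True if a module can be found on the current Python path."""
    return importlib.util.find_spec(name) is not None


for pkg in ("torch", "axolotl"):
    print(f"{pkg}: {'OK' if installed(pkg) else 'MISSING'}")
```

If either package reports MISSING, re-run the install step inside the same Conda environment or venv you intend to train from.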
6 Troubleshooting
If you encounter installation issues, see our FAQ and Debugging Guide.