monkeypatch.attention.mllama
Monkeypatch for Vision Llama for FA2 support
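The patch works by swapping the stock Mllama attention classes in transformers for the FA2 variants documented below. The sketch here shows one way such a patch could be wired up; the `patch_mllama_attention` helper and the exact attributes being reassigned are illustrative assumptions, not this module's confirmed entry point.

```python
from transformers.models.mllama import modeling_mllama

# Import path follows the module name used in this documentation.
from monkeypatch.attention.mllama import (
    MllamaTextCrossFlashAttention2,
    MllamaTextSelfFlashAttention2,
)


def patch_mllama_attention() -> None:
    """Hypothetical helper: replace the module-level attention classes so that
    models constructed afterwards pick up the Flash Attention 2 forward passes."""
    modeling_mllama.MllamaTextCrossAttention = MllamaTextCrossFlashAttention2
    modeling_mllama.MllamaTextSelfAttention = MllamaTextSelfFlashAttention2
```

Under these assumptions, such a helper would be called before loading the model (for example, before `MllamaForConditionalGeneration.from_pretrained(...)`) so that newly built layers use the patched classes.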
Classes
Name | Description |
---|---|
MllamaTextCrossFlashAttention2 | Mllama flash cross-attention module. This module inherits from MllamaTextCrossAttention and implements the forward pass using Flash Attention for improved performance. |
MllamaTextSelfFlashAttention2 | Mllama flash self-attention module. This module inherits from MllamaTextSelfAttention and implements the forward pass using Flash Attention for improved performance. |
MllamaTextCrossFlashAttention2
monkeypatch.attention.mllama.MllamaTextCrossFlashAttention2(self, *args, **kwargs)
Mllama flash cross-attention module. This module inherits from MllamaTextCrossAttention and implements the forward pass using Flash Attention for improved performance.
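The essence of the FA2 forward pass is replacing eager softmax attention with a call to `flash_attn_func` from the flash-attn package. The sketch below shows that pattern for cross-attention in isolation; the tensor names, the standalone function, and the omission of KV caching and GQA head-repeat handling are simplifications, not the class's actual forward signature.

```python
import torch
from flash_attn import flash_attn_func  # provided by the flash-attn package


def flash_cross_attention(
    query: torch.Tensor,  # (batch, q_len, num_heads, head_dim), fp16/bf16
    key: torch.Tensor,    # (batch, kv_len, num_kv_heads, head_dim)
    value: torch.Tensor,  # (batch, kv_len, num_kv_heads, head_dim)
    dropout_p: float = 0.0,
) -> torch.Tensor:
    # Cross-attention attends over the vision encoder's states,
    # so no causal mask is applied.
    return flash_attn_func(query, key, value, dropout_p=dropout_p, causal=False)
```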
MllamaTextSelfFlashAttention2
monkeypatch.attention.mllama.MllamaTextSelfFlashAttention2(self, config, layer_idx, *args, **kwargs)
Mllama flash self-attention module. This module inherits from MllamaTextSelfAttention and implements the forward pass using Flash Attention for improved performance.
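The self-attention variant follows the same pattern as the cross-attention sketch above, with two differences visible in its signature: it is constructed with the text config and its layer_idx (mirroring MllamaTextSelfAttention), and the flash call is causal because decoder self-attention is autoregressive. A one-line sketch of that distinction, under the same assumptions as the earlier example:

```python
# Same tensor layout as above; causal=True masks future positions in the decoder.
attn_output = flash_attn_func(query, key, value, dropout_p=0.0, causal=True)
```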