monkeypatch.llama_patch_multipack

Patches LlamaAttention to use torch.nn.functional.scaled_dot_product_attention.
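A minimal sketch (not the patch itself) of the substitution the docstring describes, assuming PyTorch ≥ 2.0: `scaled_dot_product_attention` computes the same result as the manual softmax-attention formulation used by the original LlamaAttention, while allowing PyTorch to dispatch to fused kernels when available. Tensor shapes here are illustrative.

```python
import torch
import torch.nn.functional as F

# Illustrative shapes: (batch, num_heads, seq_len, head_dim).
batch, heads, seq_len, head_dim = 2, 4, 8, 16
q = torch.randn(batch, heads, seq_len, head_dim)
k = torch.randn(batch, heads, seq_len, head_dim)
v = torch.randn(batch, heads, seq_len, head_dim)

# Fused path, with the causal mask a decoder-only model like Llama uses.
out_sdpa = F.scaled_dot_product_attention(q, k, v, is_causal=True)

# Equivalent manual computation for comparison.
scale = head_dim ** -0.5
scores = (q @ k.transpose(-2, -1)) * scale
causal_mask = torch.triu(
    torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1
)
scores = scores.masked_fill(causal_mask, float("-inf"))
out_manual = torch.softmax(scores, dim=-1) @ v

# The two paths agree up to floating-point tolerance.
assert torch.allclose(out_sdpa, out_manual, atol=1e-5)
```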