
@Clement25 Clement25 commented Aug 9, 2025

This handles the case where `flash_attn_2` is not available.

Currently this only adds `hijack_llama`; implementations for other models will be added at a later time.
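A minimal sketch of what such a fallback could look like, assuming a hypothetical `hijack_llama` entry point that selects an attention backend based on whether the `flash_attn` package can be imported (names here are illustrative, not the PR's actual code):

```python
import importlib.util


def flash_attn_available() -> bool:
    """Return True if the flash_attn package can be imported."""
    return importlib.util.find_spec("flash_attn") is not None


def hijack_llama(use_flash_attn: bool) -> str:
    """Hypothetical patch entry point: pick the LLaMA attention
    backend depending on flash_attn_2 availability."""
    if use_flash_attn:
        return "flash_attn_2"  # fused-kernel fast path
    return "eager"             # plain PyTorch fallback


# Choose the backend once at import/patch time.
backend = hijack_llama(flash_attn_available())
```

The same availability check would gate the per-model patches the author plans to add for other architectures.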
