Hello, I use MultiheadAttention in my model. How can I bring in agent-attention, ideally without retraining the model? My idea is to replace the q, k, v computation in its forward pass. Is that possible?
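A minimal sketch of that idea, under some loud assumptions: PyTorch's `nn.MultiheadAttention` with `batch_first=True`, self-attention with a packed `in_proj_weight` (i.e. `kdim == vdim == embed_dim`), no attention or padding masks, and the pool-queries-into-agents formulation from the agent-attention paper. The names `agent_attention` and `AgentMHA` are hypothetical, not part of this repo's API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def agent_attention(q, k, v, num_agents=49):
    """Agent-attention core: queries are pooled into a small set of agent
    tokens, which first aggregate from k/v and then broadcast back to the
    full query set. q, k, v: (batch, heads, seq_len, head_dim)."""
    b, h, n, d = q.shape
    # Pool queries along the sequence dim to form agent tokens
    # (adaptive pooling is one simple choice; a 2D pool fits vision models).
    agents = F.adaptive_avg_pool1d(
        q.reshape(b * h, n, d).transpose(1, 2), num_agents
    ).transpose(1, 2).reshape(b, h, num_agents, d)
    scale = d ** -0.5
    # Agent aggregation: agents attend over the keys, gathering values.
    agent_v = F.softmax(agents @ k.transpose(-2, -1) * scale, dim=-1) @ v
    # Agent broadcast: queries attend over the agent tokens.
    return F.softmax(q @ agents.transpose(-2, -1) * scale, dim=-1) @ agent_v


class AgentMHA(nn.Module):
    """Wraps a pretrained nn.MultiheadAttention, reusing its q/k/v and
    output projection weights but swapping the softmax-attention core
    for agent attention. Assumes batch_first=True, self-attention,
    and no masks."""

    def __init__(self, mha: nn.MultiheadAttention, num_agents=49):
        super().__init__()
        self.mha, self.num_agents = mha, num_agents

    def forward(self, x):  # x: (batch, seq_len, embed_dim)
        m = self.mha
        b, n, e = x.shape
        h, d = m.num_heads, e // m.num_heads
        # Reuse the pretrained packed q/k/v projection.
        q, k, v = F.linear(x, m.in_proj_weight, m.in_proj_bias).chunk(3, dim=-1)
        q, k, v = (t.reshape(b, n, h, d).transpose(1, 2) for t in (q, k, v))
        out = agent_attention(q, k, v, self.num_agents)
        out = out.transpose(1, 2).reshape(b, n, e)
        # Reuse the pretrained output projection.
        return m.out_proj(out)
```

Dropping it into a model would then look like:

```python
mha = nn.MultiheadAttention(embed_dim=256, num_heads=8, batch_first=True)
layer = AgentMHA(mha, num_agents=49)
y = layer(torch.randn(2, 196, 256))  # -> (2, 196, 256)
```

This keeps the pretrained projections intact, but the attention pattern itself changes, so some accuracy drop without at least a short fine-tuning run seems likely.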