Releases: lucidrains/FLASH-pytorch
0.1.9
0.1.8
add the ability to directly handle T5 relative positional bias within GAU
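For context on that entry, T5-style relative positional bias maps each query/key offset to one of a fixed number of learned buckets: nearby offsets get their own bucket, while distant offsets share logarithmically sized buckets, and a learned per-head embedding indexed by the bucket is added to the attention scores. The sketch below (illustrative function name and defaults, not the repo's actual code) shows the standard bucketing scheme:

```python
import math

def t5_relative_position_bucket(rel_pos, num_buckets=32, max_distance=128):
    # Hypothetical sketch of T5-style bucketing, not the repo's actual API.
    # Bidirectional: half the buckets cover each sign of the relative offset.
    num_buckets //= 2
    bucket = num_buckets if rel_pos > 0 else 0
    n = abs(rel_pos)
    # the first half of each side's buckets hold exact small offsets ...
    max_exact = num_buckets // 2
    if n < max_exact:
        bucket += n
    else:
        # ... the rest grow logarithmically out to max_distance
        log_ratio = math.log(n / max_exact) / math.log(max_distance / max_exact)
        bucket += min(max_exact + int(log_ratio * (num_buckets - max_exact)),
                      num_buckets - 1)
    return bucket
```

A learned embedding per (bucket, head) is then added to the query-key similarity scores before the attention nonlinearity.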
0.1.7
fix the Laplace activation function, thanks to @boweny-cerebras
0.1.6
add the Laplace attention function, which the authors of Mega propose…
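The Laplace attention function from the Mega paper (arXiv:2209.10655) is a smooth, bounded stand-in for squared ReLU, which the authors argue is more numerically stable. A minimal sketch based on the paper's formula follows (the repo's exact implementation may differ; in practice this is applied elementwise to a tensor with `torch.special.erf`):

```python
import math

def laplace_attn_fn(x):
    # Laplace attention function per the Mega paper: the constants are
    # chosen so the curve matches relu(x)**2 in value and slope at
    # x = sqrt(0.5), while staying bounded in (0, 1).
    mu = math.sqrt(0.5)
    sigma = math.sqrt(1 / (4 * math.pi))
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))
```

Unlike squared ReLU, the output saturates at 1 for large inputs, which is the stability argument in a nutshell.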
v0.1.5
make reducing group for context in noncausal linear attention a hyper…
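Regarding the entry above: in noncausal linear attention the global key-value context is a plain sum over positions, so summing within non-overlapping groups and then across groups gives the identical result for any group size — which is why the group size can safely be exposed as a hyperparameter and tuned purely for memory/speed. A scalar pure-Python sketch (illustrative names, not the repo's API) makes the associativity concrete:

```python
def grouped_context(ks, vs, group_size):
    # sum phi(k)_j * v_j within non-overlapping groups, then across groups;
    # since addition is associative, the noncausal (global) context is
    # independent of group_size.
    total = 0.0
    for start in range(0, len(ks), group_size):
        group_kv = sum(k * v for k, v in zip(ks[start:start + group_size],
                                             vs[start:start + group_size]))
        total += group_kv
    return total
```

(The causal case is different: there, the grouping interacts with the cumulative sum and does affect the computation.)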
v0.1.4
fix an error thanks to @ShomyLiu at https://github.com/lucidrains/FLA…
0.1.2
fix mask for linear attention
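The subtlety behind that fix: in linear attention the keys and values are summed into a shared context before any query touches them, so a padding mask must zero the key features *before* the sums — masking the output afterwards is not enough, because padded tokens would already have leaked into every query's result. A scalar pure-Python sketch (simplified to one feature dimension; names are illustrative, not the repo's API):

```python
import math

def elu_plus_one(x):
    # a common positive feature map phi for linear attention
    return x + 1.0 if x > 0 else math.exp(x)

def linear_attention(qs, ks, vs, mask):
    # out_i = phi(q_i) * sum_j phi(k_j) v_j / (phi(q_i) * sum_j phi(k_j));
    # masked keys are zeroed BEFORE the k/v sums so padding tokens
    # contribute nothing to any query's output.
    fq = [elu_plus_one(q) for q in qs]
    fk = [elu_plus_one(k) if keep else 0.0 for k, keep in zip(ks, mask)]
    kv = sum(fk_j * v_j for fk_j, v_j in zip(fk, vs))
    k_sum = sum(fk)
    return [fq_i * kv / (fq_i * k_sum + 1e-8) for fq_i in fq]
```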
0.1.1
add token shift for dramatic improvements; credit @BlinkDL
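Token shift (credited to @BlinkDL) swaps half of each token's channels for the corresponding channels of the previous token, giving every position a cheap view of its predecessor before attention runs. In PyTorch this is usually a pad-and-concat along the sequence dimension; here is a dependency-free sketch of the same operation (illustrative, not the repo's exact code):

```python
def token_shift(tokens):
    # tokens: list of equal-length float lists (seq_len x dim, dim even);
    # keep the first half of each token's channels, replace the second
    # half with the previous token's second half (zeros at position 0).
    dim = len(tokens[0])
    half = dim // 2
    zeros = [0.0] * (dim - half)
    out = []
    for i, tok in enumerate(tokens):
        prev_half = tokens[i - 1][half:] if i > 0 else zeros
        out.append(tok[:half] + prev_half)
    return out
```

Because the shift only looks backwards, it preserves causality and costs no parameters.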
0.0.15
fix a mask bug
0.0.14
fix