Releases: lucidrains/FLASH-pytorch

0.1.9

26 Sep 00:14
address https://github.com/lucidrains/FLASH-pytorch/issues/12

0.1.8

12 May 18:09
add ability to directly handle t5 relative positional bias within GAU
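For context, a T5-style relative position bias adds a learned scalar per relative offset to the attention logits before softmax. A minimal sketch of the idea (simplified: no log-spaced bucketing as in T5, and `RelPosBias` is a hypothetical name, not this library's API):

```python
import torch
from torch import nn

class RelPosBias(nn.Module):
    # one learned scalar per head per relative offset j - i,
    # added to attention logits of shape (heads, seq, seq)
    def __init__(self, max_seq_len, heads=1):
        super().__init__()
        # offsets range over [-(max_seq_len - 1), max_seq_len - 1]
        self.bias = nn.Parameter(torch.randn(2 * max_seq_len - 1, heads) * 0.02)

    def forward(self, seq_len):
        pos = torch.arange(seq_len)
        rel = pos[None, :] - pos[:, None]      # (i, j) -> offset j - i
        rel = rel + (self.bias.shape[0] // 2)  # shift to non-negative indices
        return self.bias[rel].permute(2, 0, 1) # (heads, seq, seq)
```

Pairs of positions with the same offset share a bias, so the diagonal of the returned matrix is constant per head.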

0.1.7

19 Feb 04:13
fix laplace activation function, thanks to @boweny-cerebras

0.1.6

23 Sep 20:08
add the laplace attention function, which the authors of Mega propose…

v0.1.5

19 Jun 16:20
make reducing group for context in noncausal linear attention a hyper…

v0.1.4

18 Jun 18:55
fix an error thanks to @ShomyLiu at https://github.com/lucidrains/FLA…

0.1.2

08 Apr 02:29
fix mask for linear attention
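Masking matters in linear attention because keys and values are summed into one global context matrix: if padded positions are not zeroed before that sum, they leak into every query's output. A minimal non-causal sketch of the idea (hypothetical function name, not this library's exact code):

```python
import torch

def masked_linear_attention(q, k, v, mask):
    # mask: (batch, seq) bool, True at valid (non-padding) positions.
    # zero padded keys/values BEFORE the global k^T v summation
    k = k.masked_fill(~mask[..., None], 0.)
    v = v.masked_fill(~mask[..., None], 0.)
    context = torch.einsum('b n d, b n e -> b d e', k, v)  # global context
    return torch.einsum('b n d, b d e -> b n e', q, context)
```

With the mask applied, changing the contents of padded positions leaves the output untouched.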

0.1.1

29 Mar 16:12
add token shift, for dramatic improvements, cite @BlinkDL
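Token shift, popularized by @BlinkDL (RWKV), mixes each token with its predecessor by shifting half the feature channels back one position along the sequence. A minimal sketch of the technique (an assumption about the general recipe, not necessarily this library's exact code):

```python
import torch
import torch.nn.functional as F

def shift_tokens(x):
    # x: (batch, seq, dim). Split features in half; shift one half
    # forward by one position so each token sees part of the
    # previous token's features (position 0 gets zeros).
    x_keep, x_shift = x.chunk(2, dim=-1)
    x_shift = F.pad(x_shift, (0, 0, 1, -1))  # pad seq dim front, drop last
    return torch.cat((x_keep, x_shift), dim=-1)
```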

0.0.15

29 Mar 06:20
0.0.15a: mask bug

0.0.14

29 Mar 05:56
fix