An open API service providing commit metadata for open source projects.

GitHub / lucidrains/light-recurrent-unit-pytorch / commits

Implementation of a Light Recurrent Unit in PyTorch
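The commits below build up a single-gate recurrent cell and then stack it. As a rough orientation, a light recurrent unit typically computes a candidate hidden state from the input and mixes it into the previous hidden state through one learned gate. This is a minimal sketch of that pattern only, not the repository's actual code; the class and layer names here are hypothetical.

```python
import torch
from torch import nn

class LightRecurrentUnitCell(nn.Module):
    """Hypothetical single-gate recurrent cell sketch (not the repo's exact code)."""

    def __init__(self, dim):
        super().__init__()
        self.to_candidate = nn.Linear(dim, dim, bias=False)  # candidate from input only
        self.to_gate = nn.Linear(dim * 2, dim)               # gate sees input and hidden

    def forward(self, x, hidden):
        candidate = torch.tanh(self.to_candidate(x))
        gate = torch.sigmoid(self.to_gate(torch.cat((x, hidden), dim=-1)))
        # interpolate between previous hidden and candidate: (1 - g) * h + g * c
        return torch.lerp(hidden, candidate, gate)
```

Calling the cell once with an input and a hidden state of shape `(batch, dim)` returns the next hidden state of the same shape, which doubles as the layer's output.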

SHA Message Author Date Stats
e05feb41 get char level enwik8 training in there lucidrains <l****s@g****m> about 1 year ago
3541042f address https://github.com/lucidrains/light-recurrent-unit-pytorch/issues/3 lucidrains <l****s@g****m> about 1 year ago
25da1a08 allow for hidden state to be fed back for recurrent units lucidrains <l****s@g****m> about 1 year ago
46396c55 allow for replicating eq 7 in the paper, even if personally prefer gated lru ... lucidrains <l****s@g****m> about 1 year ago
e5d2b847 some more layers per depth for gated variant lucidrains <l****s@g****m> about 1 year ago
6b249194 go off script lucidrains <l****s@g****m> about 1 year ago
0693a511 able to use learned initial hidden lucidrains <l****s@g****m> about 1 year ago
7cbd2cf6 feedforward block for good measure lucidrains <l****s@g****m> about 1 year ago
428307a2 add a light recurrent unit block, so it can be fit into a transformer lucidrains <l****s@g****m> about 1 year ago
c6b3568d good enough for a few experiments Phil Wang <l****s@g****m> (committed by GitHub <n****y@g****m>) about 1 year ago
32b70557 drop dep lucidrains <l****s@g****m> about 1 year ago
7c6786e3 simple stacked lru lucidrains <l****s@g****m> about 1 year ago
1ba1157e fixes lucidrains <l****s@g****m> about 1 year ago
f3765389 work up to the stacked lru lucidrains <l****s@g****m> about 1 year ago
fc382c29 complete one cell lucidrains <l****s@g****m> about 1 year ago
8e5e7fab add figure 2 lucidrains <l****s@g****m> about 1 year ago
aedaf388 scaffold lucidrains <l****s@g****m> about 1 year ago
3a9d327c readme lucidrains <l****s@g****m> about 1 year ago
3d1c068a scaffold lucidrains <l****s@g****m> about 1 year ago
090839ce Initial commit Phil Wang <l****s@g****m> (committed by GitHub <n****y@g****m>) about 1 year ago
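Several commits above ("simple stacked lru", "able to use learned initial hidden", "allow for hidden state to be fed back for recurrent units") describe a stacked recurrent module whose per-layer initial hidden states are learned parameters and whose final hidden states can be passed back in on the next call. The sketch below shows that stacking pattern under those assumptions, using `nn.GRUCell` purely as a stand-in for the repository's own cell; the module name and structure are hypothetical.

```python
import torch
from torch import nn

class StackedRecurrent(nn.Module):
    """Stacked recurrent layers with learned initial hidden states (sketch).

    nn.GRUCell stands in for the repo's light recurrent unit cell.
    """

    def __init__(self, dim, depth):
        super().__init__()
        self.cells = nn.ModuleList(nn.GRUCell(dim, dim) for _ in range(depth))
        self.init_hidden = nn.Parameter(torch.zeros(depth, dim))  # learned h0 per layer

    def forward(self, seq, hiddens=None):
        # seq: (batch, time, dim); hiddens: optional list of (batch, dim) per layer
        batch, time, _ = seq.shape
        if hiddens is None:
            hiddens = [h0.unsqueeze(0).expand(batch, -1) for h0 in self.init_hidden]
        outputs = []
        for t in range(time):
            x = seq[:, t]
            new_hiddens = []
            for cell, h in zip(self.cells, hiddens):
                x = cell(x, h)  # each layer's new hidden feeds the next layer
                new_hiddens.append(x)
            hiddens = new_hiddens
            outputs.append(x)
        # returning the final hiddens lets the caller feed state back in
        return torch.stack(outputs, dim=1), hiddens
```

A second call like `out, hiddens = model(next_chunk, hiddens)` continues the recurrence across chunks, which is the "hidden state fed back" behavior the commit message refers to.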
