| Repository | Stars | Open PRs | Last Commit | Releases | Last Release | Open Issues | License | Language | Description |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| lucidrains/x-transformers | 3,840 | 10 | about 2 years ago | 317 | December 02, 2023 | 55 | MIT | Python | A simple but complete full-attention transformer with a set of promising experimental features from various papers |
| Separius/awesome-fast-attention | 717 | 0 | over 4 years ago | 0 | | 0 | GPL-3.0 | Python | List of efficient attention modules |
| bhoov/exbert | 541 | 0 | over 2 years ago | 0 | | 18 | Apache-2.0 | Python | A visual analysis tool to explore learned representations in transformer models |
| clarkkev/attention-analysis | 309 | 0 | about 5 years ago | 0 | | 10 | MIT | Jupyter Notebook | |
| rusiaaman/XLnet-gen | 166 | 0 | about 5 years ago | 0 | | 11 | MIT | Python | XLNet for language generation |
| brave-intl/basic-attention-token-crowdsale | 156 | 0 | about 5 years ago | 0 | | 1 | | TeX | Basic Attention Token |
| spro/RARNN | 57 | 0 | almost 9 years ago | 0 | | 0 | MIT | Jupyter Notebook | Recursive application of recurrent neural networks for hierarchical intent parsing |
| BATgrowth/detect-bat-publishers | 17 | 2 | about 7 years ago | 5 | July 10, 2018 | 5 | MIT | JavaScript | DEPRECATED: An npm package to detect Brave Browser / Basic Attention Token publishers |
| DoranLyong/Awesome-TokenMixer-pytorch | 7 | 0 | about 2 years ago | 0 | | 0 | | Python | PyTorch implementations of various token mixers (attention mechanisms, MLPs, etc.) for understanding computer vision papers and other tasks |
| skyail/skyail | 5 | 0 | almost 5 years ago | 4 | January 10, 2021 | 0 | GPL-3.0 | Java | A lightweight, detail-oriented Java project applicable to a variety of scenarios |