| Repository | Stars | Last Commit | Releases | Latest Release | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|
| lucidrains/vit-pytorch | 16,298 | over 2 years ago | 184 | November 15, 2023 | 114 | MIT | Python | Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch |
| brightmart/text_classification | 7,628 | over 2 years ago | 0 | — | 45 | MIT | Python | All kinds of text classification models and more with deep learning |
| Kyubyong/transformer | 3,882 | about 3 years ago | 0 | — | 134 | Apache-2.0 | Python | A TensorFlow implementation of the Transformer: Attention Is All You Need |
| lucidrains/x-transformers | 3,840 | about 2 years ago | 317 | December 02, 2023 | 55 | MIT | Python | A simple but complete full-attention transformer with a set of promising experimental features from various papers |
| sgrvinod/a-PyTorch-Tutorial-to-Image-Captioning | 2,084 | over 3 years ago | 0 | — | 97 | MIT | Python | Show, Attend, and Tell: a PyTorch tutorial to image captioning |
| PetarV-/GAT | 2,078 | over 4 years ago | 0 | — | 27 | MIT | Python | Graph Attention Networks (https://arxiv.org/abs/1710.10903) |
| lucidrains/reformer-pytorch | 1,917 | almost 3 years ago | 139 | November 06, 2021 | 14 | MIT | Python | Reformer, the efficient Transformer, in PyTorch |
| gordicaleksa/pytorch-GAT | 1,815 | over 3 years ago | 0 | — | 12 | MIT | Jupyter Notebook | Implementation of the original GAT paper (Veličković et al.), with an additional playground.py for visualizing the Cora dataset, GAT embeddings, the attention mechanism, and entropy histograms; supports both Cora (transductive) and PPI (inductive) examples |
| Diego999/pyGAT | 1,684 | over 4 years ago | 0 | — | 32 | MIT | Python | PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903) |
| ml-jku/hopfield-layers | 1,258 | about 4 years ago | 0 | — | 0 | Other | Python | Hopfield Networks is All You Need |
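Every repository in this list builds on the same core primitive, scaled dot-product attention, softmax(QKᵀ/√dₖ)·V. As a quick orientation before diving into any of the libraries above, here is a minimal, dependency-free Python sketch of that operation (the function names are illustrative, not taken from any of the listed projects):

```python
import math

def softmax(xs):
    # Subtract the max before exponentiating for numerical stability.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Each argument is a list of equal-length float vectors.
    """
    d_k = len(keys[0])
    out = []
    for q in queries:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in keys]
        weights = softmax(scores)
        # Output is the attention-weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# With identical keys the attention weights are uniform, so the output
# is the plain average of the value vectors.
print(attention([[1.0, 0.0]],
                [[1.0, 0.0], [1.0, 0.0]],
                [[0.0, 0.0], [2.0, 2.0]]))  # → [[1.0, 1.0]]
```

The listed libraries differ mainly in what they wrap around this primitive: vit-pytorch and the image-captioning tutorial apply it to image patches and pixels, the GAT implementations restrict it to graph neighborhoods, and reformer-pytorch replaces the full QKᵀ product with a cheaper approximation.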