| Repository | Stars | Downloads | Dependent Repos | Dependent Packages | Last Commit | Total Releases | Latest Release | Open Issues | License | Language | Description |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| philipperemy/keras-attention | 2,791 | | 0 | 0 | over 2 years ago | 0 | | 2 | apache-2.0 | Python | Keras Attention Layer (Luong and Bahdanau scores). |
| PetarV-/GAT | 2,078 | | 0 | 0 | over 4 years ago | 0 | | 27 | mit | Python | Graph Attention Networks (https://arxiv.org/abs/1710.10903) |
| lucidrains/lambda-networks | 1,110 | | 0 | 0 | over 5 years ago | 11 | November 18, 2020 | 8 | mit | Python | Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute |
| datalogue/keras-attention | 656 | | 0 | 0 | almost 7 years ago | 0 | | 22 | agpl-3.0 | Python | Visualizing RNNs using the attention mechanism |
| CyberZHG/keras-self-attention | 570 | | 11 | 9 | about 4 years ago | 43 | January 22, 2022 | 0 | mit | Python | Attention mechanism for processing sequential data that considers the context for each timestamp. |
| lvapeab/nmt-keras | 514 | | 0 | 0 | over 4 years ago | 0 | | 4 | mit | Python | Neural Machine Translation with Keras |
| danielegrattarola/keras-gat | 301 | | 0 | 0 | over 5 years ago | 0 | | 3 | mit | Python | Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903) |
| uzaymacar/attention-mechanisms | 294 | | 0 | 0 | over 4 years ago | 0 | | 2 | mit | Python | Implementations of a family of attention mechanisms, suitable for a wide range of natural language processing tasks and compatible with TensorFlow 2.0 and Keras. |
| cbaziotis/datastories-semeval2017-task4 | 171 | | 0 | 0 | almost 8 years ago | 0 | | 8 | mit | Python | Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis". |
| AlexGidiotis/Document-Classifier-LSTM | 167 | | 0 | 0 | over 2 years ago | 0 | | 3 | mit | Python | A bidirectional LSTM with attention for multiclass/multilabel text classification. |
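Several of the repositories above implement Luong-style (dot-product) attention. As a rough, framework-free sketch of the score → weight → context pipeline these layers share (the function names here are illustrative and not taken from any listed repo):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def luong_dot_attention(query, values):
    """Luong-style dot-product attention (illustrative sketch).

    query:  (d,)   current decoder state
    values: (T, d) encoder states for T timesteps
    Returns (context, weights): the attention-weighted sum of the
    encoder states and the attention distribution over timesteps.
    """
    scores = values @ query      # (T,)  dot-product alignment scores
    weights = softmax(scores)    # (T,)  attention distribution, sums to 1
    context = weights @ values   # (d,)  weighted sum of encoder states
    return context, weights

# Tiny demo: the query aligns more strongly with the first encoder state.
query = np.array([1.0, 0.0])
values = np.array([[1.0, 0.0],
                   [0.0, 1.0]])
context, weights = luong_dot_attention(query, values)
```

Bahdanau (additive) attention differs only in the scoring step, replacing the dot product with a small learned feed-forward network over the concatenated query and value.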