| Repository | Stars | Dependents | Last Updated | Releases | Latest Release | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|---|
| labmlai/annotated_deep_learning_paper_implementations | 41,877 | 2 | over 2 years ago | 79 | November 05, 2023 | 30 | MIT | Jupyter Notebook | 🧑‍🏫 60 implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, Sophia, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠 |
| jadore801120/attention-is-all-you-need-pytorch | 7,910 | 0 | over 2 years ago | 0 | | 74 | MIT | Python | A PyTorch implementation of the Transformer model from "Attention Is All You Need". |
| espnet/espnet | 7,563 | 5 | about 2 years ago | 33 | October 25, 2023 | 270 | Apache-2.0 | Python | End-to-end speech processing toolkit. |
| zhouhaoyi/Informer2020 | 4,553 | 0 | over 2 years ago | 0 | | 128 | Apache-2.0 | Python | Code for the paper "Informer", accepted at AAAI 2021. |
| lucidrains/x-transformers | 3,840 | 10 | about 2 years ago | 317 | December 02, 2023 | 55 | MIT | Python | A simple but complete full-attention transformer with a set of promising experimental features from various papers. |
| google-research/scenic | 2,733 | 0 | about 2 years ago | 0 | | 213 | Apache-2.0 | Python | Scenic: a JAX library for computer vision research and beyond. |
| danielegrattarola/spektral | 2,317 | 6 | about 2 years ago | 34 | June 01, 2023 | 67 | MIT | Python | Graph neural networks with Keras and TensorFlow 2. |
| gordicaleksa/pytorch-GAT | 1,815 | 0 | over 3 years ago | 0 | | 12 | MIT | Jupyter Notebook | My implementation of the original GAT paper (Veličković et al.), with an additional playground.py file for visualizing the Cora dataset, GAT embeddings, the attention mechanism, and entropy histograms; supports both Cora (transductive) and PPI (inductive) examples. |
| iscyy/yoloair | 1,714 | 0 | almost 3 years ago | 0 | | 30 | GPL-3.0 | Python | 🔥 YOLOv5, YOLOv6, YOLOv7, YOLOv8, PP-YOLOE, YOLOX, YOLOR, YOLOv4, YOLOv3, Transformer, Attention, TOOD, and improved YOLOv5/YOLOv7; supports improving the backbone, neck, head, loss, IoU, NMS, and other modules 🚀 |
| microsoft/DeBERTa | 1,673 | 0 | over 2 years ago | 13 | February 09, 2021 | 63 | MIT | Python | The implementation of DeBERTa (Decoding-enhanced BERT with Disentangled Attention). |
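
The common thread of the repositories above is attention, and most of them build on the scaled dot-product attention operation from "Attention Is All You Need". As a quick reference, here is a minimal NumPy sketch of that operation; the function names and toy shapes are illustrative and not taken from any of the listed codebases:

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the row-wise max for numerical stability
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)  # (queries, keys)
    weights = softmax(scores, axis=-1)              # rows sum to 1
    return weights @ v, weights

# toy example: 3 queries attending over 4 key/value pairs, d_k = 8
rng = np.random.default_rng(0)
q = rng.normal(size=(3, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape, w.shape)  # (3, 8) (3, 4)
```

Each output row is a convex combination of the value vectors, with the mixing weights determined by query-key similarity; the `1/sqrt(d_k)` scaling keeps the dot products from saturating the softmax as the head dimension grows.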