| Repository | Stars | Last Updated | Releases | Latest Release | Open Issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|
| NLP-LOVE/ML-NLP | 10,874 | over 4 years ago | 0 | | 29 | | Jupyter Notebook | Knowledge points and code implementations commonly tested in Machine Learning, Deep Learning, and NLP interviews; the foundational theory every algorithm engineer should know. |
| BlinkDL/RWKV-LM | 10,705 | about 2 years ago | 0 | | 60 | apache-2.0 | Python | RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings. |
| lucidrains/PaLM-rlhf-pytorch | 7,496 | about 2 years ago | 69 | January 26, 2023 | 14 | mit | Python | Implementation of RLHF (Reinforcement Learning from Human Feedback) on top of the PaLM architecture. Basically ChatGPT, but with PaLM. |
| lucidrains/DALLE-pytorch | 5,383 | over 2 years ago | 174 | May 30, 2022 | 125 | mit | Python | Implementation/replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch. |
| cmhungsteve/Awesome-Transformer-Attention | 3,895 | about 2 years ago | 0 | | 15 | | | A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites. |
| lucidrains/x-transformers | 3,840 | about 2 years ago | 317 | December 02, 2023 | 55 | mit | Python | A simple but complete full-attention transformer with a set of promising experimental features from various papers. |
| philipperemy/keras-attention | 2,791 | over 2 years ago | 0 | | 2 | apache-2.0 | Python | Keras attention layer (Luong and Bahdanau scores). |
| lucidrains/musiclm-pytorch | 2,686 | over 2 years ago | 38 | September 06, 2023 | 18 | mit | Python | Implementation of MusicLM, Google's SOTA model for music generation using attention networks, in PyTorch. |
| lucidrains/audiolm-pytorch | 2,112 | over 2 years ago | 311 | December 02, 2023 | 36 | mit | Python | Implementation of AudioLM, Google Research's SOTA language-modeling approach to audio generation, in PyTorch. |
| gordicaleksa/pytorch-GAT | 1,815 | over 3 years ago | 0 | | 12 | mit | Jupyter Notebook | My implementation of the original GAT paper (Veličković et al.), with an additional playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. Both Cora (transductive) and PPI (inductive) examples are supported. |
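Several of the repositories above (philipperemy/keras-attention, lucidrains/x-transformers, cmhungsteve/Awesome-Transformer-Attention) revolve around attention scoring. As a minimal, library-free sketch of the shared idea (illustrative only, not code from any listed repo), Luong-style dot-product attention scores a query against each key and returns a weighted sum of the values:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def luong_dot_attention(query, keys, values):
    """Luong-style (multiplicative) attention over one sequence.

    query:  (d,)        the vector attending over the sequence
    keys:   (seq, d)    one key per timestep
    values: (seq, d_v)  one value per timestep
    """
    scores = keys @ query        # (seq,) raw dot-product scores
    weights = softmax(scores)    # attention distribution, sums to 1
    context = weights @ values   # (d_v,) weighted sum of values
    return context, weights
```

Bahdanau (additive) attention differs only in the scoring step, replacing the dot product with a small feed-forward network over the concatenated query and key.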