| Repository | Stars | Forks | Open PRs | Last commit | Releases | Latest release | Open issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|---|---|
| jindongwang/transferlearning | 12,494 | 0 | 0 | about 2 years ago | 0 | | 14 | MIT | Python | Transfer learning / domain adaptation / domain generalization / multi-task learning etc. Papers, codes, datasets, applications, tutorials (迁移学习, "transfer learning"). |
| yaoyao-liu/meta-transfer-learning | 691 | 0 | 0 | over 3 years ago | 0 | | 37 | MIT | Python | TensorFlow and PyTorch implementations of "Meta-Transfer Learning for Few-Shot Learning" (CVPR 2019). |
| pykale/pykale | 415 | 0 | 0 | about 2 years ago | 12 | April 12, 2022 | 9 | MIT | Python | Knowledge-Aware machine LEarning (KALE): accessible machine learning from multiple sources for interdisciplinary research, part of the 🔥PyTorch ecosystem. ⭐ Star to support our work! |
| mbs0221/Multitask-Learning | 374 | 0 | 0 | about 5 years ago | 0 | | 2 | | | Awesome multitask learning resources. |
| mims-harvard/G-Meta | 96 | 0 | 0 | over 3 years ago | 0 | | 3 | | Python | Graph meta-learning via local subgraphs (NeurIPS 2020). |
| ThyrixYang/awesome-artificial-intelligence-research | 80 | 0 | 0 | over 3 years ago | 0 | | 0 | | | A curated list of Artificial Intelligence (AI) research that tracks cutting-edge trends in AI, including recommender systems, computer vision, machine learning, etc. |
| densechen/AReLU | 58 | 0 | 1 | over 4 years ago | 1 | January 23, 2021 | 1 | MIT | Jupyter Notebook | AReLU: Attention-based Rectified Linear Unit. |
| ewanlee/ICLR2019-RL-Papers | 37 | 0 | 0 | almost 7 years ago | 0 | | 0 | | | The reinforcement-learning-related papers of ICLR 2019. |
| nayeemrizve/invariance-equivariance | 33 | 0 | 0 | about 3 years ago | 0 | | 1 | MIT | Python | "Exploring Complementary Strengths of Invariant and Equivariant Representations for Few-Shot Learning" by Mamshad Nayeem Rizve, Salman Khan, Fahad Shahbaz Khan, Mubarak Shah (CVPR 2021). |
| MichaelMMeskhi/meta-learning-progress | 26 | 0 | 0 | over 4 years ago | 0 | | 2 | GPL-3.0 | CSS | Repository tracking progress in meta-learning (MtL), including datasets and the current state of the art for the most common MtL problems. |