| Repository | Stars | | | Last Updated | Releases | Latest Release | Open Issues | License | Language | Description |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ddbourgin/numpy-ml | 14,162 | 0 | 0 | over 2 years ago | 3 | June 20, 2020 | 35 | GPL-3.0 | Python | Machine learning, in NumPy |
| danielegrattarola/spektral | 2,317 | 0 | 6 | about 2 years ago | 34 | June 01, 2023 | 67 | MIT | Python | Graph neural networks with Keras and TensorFlow 2 |
| pprp/SimpleCVReproduction | 1,021 | 0 | 0 | over 3 years ago | 0 | | 3 | Apache-2.0 | Jupyter Notebook | Replication of simple CV projects, including attention, classification, detection, keypoint detection, etc. |
| epfml/attention-cnn | 550 | 0 | 0 | almost 6 years ago | 0 | | 0 | Apache-2.0 | Python | Source code for "On the Relationship between Self-Attention and Convolutional Layers" |
| kaijieshi7/Dynamic-convolution-Pytorch | 352 | 0 | 0 | almost 4 years ago | 0 | | 13 | | Python | PyTorch implementation of "Dynamic Convolution: Attention over Convolution Kernels" (CVPR 2020) |
| hanxiao/tf-nlp-blocks | 231 | 0 | 0 | about 7 years ago | 0 | | 2 | MIT | Python | Some frequently used NLP blocks, implemented by the author |
| icoxfog417/graph-convolution-nlp | 200 | 0 | 0 | about 6 years ago | 0 | | 1 | MIT | Jupyter Notebook | Graph convolution network for NLP |
| akanimax/fagan | 100 | 0 | 0 | over 7 years ago | 0 | | 1 | MIT | Python | A variant of the Self-Attention GAN named FAGAN (Full Attention GAN) |
| BangLiu/QANet-PyTorch | 84 | 0 | 0 | over 7 years ago | 0 | | 2 | MIT | Python | Re-implementation of "QANet: Combining Local Convolution with Global Self-Attention for Reading Comprehension" |
| yanx27/GACNet | 82 | 0 | 0 | about 7 years ago | 0 | | 7 | | Python | PyTorch implementation of "Graph Attention Convolution for Point Cloud Segmentation" |