| Repository | Stars | Last update | Issues | License | Language | Description |
|---|---|---|---|---|---|---|
| zhunzhong07/Random-Erasing | 697 | over 2 years ago | 11 | Apache-2.0 | Python | Random Erasing Data Augmentation. Experiments on CIFAR10, CIFAR100, and Fashion-MNIST (a minimal sketch of the technique follows the table). |
| hwalsuklee/tensorflow-mnist-cnn | 171 | over 7 years ago | 7 | | Python | MNIST classification using a convolutional neural network. Implements techniques such as data augmentation, dropout, and batch normalization. |
| HazyResearch/tanda | 157 | about 4 years ago | 12 | MIT | Python | Learning to Compose Domain-Specific Transformations for Data Augmentation. |
| gnawice/mojo-cnn | 152 | almost 8 years ago | 5 | MIT | C++ | mojo cnn: a C++ convolutional neural network. |
| zaidalyafeai/Swift4TF | 130 | about 6 years ago | 1 | | Jupyter Notebook | A set of notebooks explaining Swift for TensorFlow, optimized to run in Google Colaboratory. |
| anokland/local-loss | 123 | over 6 years ago | 3 | | Python | PyTorch code for training neural networks without global back-propagation. |
| dlaptev/TI-pooling | 117 | over 8 years ago | 0 | Other | Python | TI-pooling: transformation-invariant pooling for feature learning in convolutional neural networks (see the second sketch after the table). |
| hwalsuklee/how-far-can-we-go-with-MNIST | 59 | almost 9 years ago | 0 | | | A collection of code for the 'how far can we go with MNIST' challenge. |
| COGMAR/RotEqNet | 43 | almost 4 years ago | 0 | | Python | Rotational Equivariant Networks for PyTorch/Python. |
| XifengGuo/DEC-DA | 33 | about 7 years ago | 1 | MIT | Python | Deep Embedded Clustering with Data Augmentation (DEC-DA). MNIST performance: acc=0.985, nmi=0.960. |
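
Since the list opens with Random Erasing, here is a minimal NumPy sketch of the idea: with probability `p`, sample a rectangle whose area and aspect ratio fall in given ranges and overwrite its pixels with random values. The function and parameter names (`random_erasing`, `area_range`, `aspect_range`) are illustrative assumptions, not the repository's actual API.

```python
import numpy as np

def random_erasing(img, p=0.5, area_range=(0.02, 0.4), aspect_range=(0.3, 3.3), rng=None):
    """Erase a random rectangle of an HxW or HxWxC image with random values (sketch)."""
    if rng is None:
        rng = np.random.default_rng()
    if rng.random() > p:              # apply the augmentation with probability p
        return img
    h, w = img.shape[:2]
    out = img.copy()
    for _ in range(10):               # retry a few times until the sampled box fits
        area = rng.uniform(*area_range) * h * w
        aspect = rng.uniform(*aspect_range)
        eh = int(round(np.sqrt(area * aspect)))
        ew = int(round(np.sqrt(area / aspect)))
        if 0 < eh < h and 0 < ew < w:
            y = rng.integers(0, h - eh)
            x = rng.integers(0, w - ew)
            fill = rng.uniform(0, 255, size=(eh, ew) + img.shape[2:])
            out[y:y + eh, x:x + ew] = fill.astype(img.dtype)
            return out
    return img                        # no valid box found; return the image unchanged
```

For example, `random_erasing(x)` on a 28x28 uint8 MNIST digit returns a copy with one randomly placed rectangle replaced by noise.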
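The TI-pooling entry names a simple mechanism: run a shared sub-network over several transformed copies of the input (e.g. rotations) and take an element-wise max over the transformation axis, so the pooled features become invariant to those transformations. The PyTorch module below is a minimal sketch under that reading; the class name `TIPooling` and its interface are assumptions, not the repository's code.

```python
import torch
import torch.nn as nn

class TIPooling(nn.Module):
    """Max-pool features over transformed copies of the input (sketch)."""

    def __init__(self, branch: nn.Module):
        super().__init__()
        self.branch = branch          # one sub-network, weights shared across all copies

    def forward(self, transformed_inputs):
        # transformed_inputs: list of tensors, one per transformation, each (B, C, H, W)
        feats = torch.stack([self.branch(x) for x in transformed_inputs], dim=0)
        return feats.max(dim=0).values  # element-wise max over the transformation axis

# Example: pool a small CNN over four 90-degree rotations of an MNIST batch.
branch = nn.Sequential(nn.Conv2d(1, 8, 3), nn.ReLU(), nn.Flatten(), nn.Linear(8 * 26 * 26, 32))
pool = TIPooling(branch)
x = torch.randn(16, 1, 28, 28)
rotations = [torch.rot90(x, k, dims=(2, 3)) for k in range(4)]
invariant_features = pool(rotations)  # shape (16, 32), approximately rotation-invariant
```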