| Repository | Stars | Last updated | Releases | Latest release | ? | License | Language | Description |
|---|---|---|---|---|---|---|---|---|
| tysam-code/hlb-CIFAR10 | 1,112 | over 2 years ago | 0 | | 1 | apache-2.0 | Python | Train to 94% on CIFAR-10 in <6.3 seconds on a single A100, the current world speed record. Or ~95.79% in ~110 seconds (or less!) |
| Hyperparticle/one-pixel-attack-keras | 1,078 | over 5 years ago | 0 | | 4 | mit | Jupyter Notebook | Keras implementation of "One pixel attack for fooling deep neural networks" using differential evolution on CIFAR-10 and ImageNet |
| BIGBALLON/cifar-10-cnn | 726 | over 5 years ago | 0 | | 4 | mit | Python | Play deep learning with CIFAR datasets |
| huyvnphan/PyTorch_CIFAR10 | 520 | almost 3 years ago | 0 | | 1 | mit | Python | Pretrained TorchVision models on the CIFAR-10 dataset (with weights) |
| PacktPublishing/Deep-Learning-with-TensorFlow-2-and-Keras | 299 | over 2 years ago | 0 | | 7 | mit | Jupyter Notebook | Deep Learning with TensorFlow 2 and Keras, published by Packt |
| wy1iu/LargeMargin_Softmax_Loss | 297 | over 7 years ago | 0 | | 10 | other | C++ | Implementation of "Large-Margin Softmax Loss for Convolutional Neural Networks" (ICML 2016) |
| naszilla/naszilla | 269 | over 3 years ago | 1 | November 15, 2020 | 3 | apache-2.0 | Python | Naszilla is a Python library for neural architecture search (NAS) |
| stanford-futuredata/dawn-bench-entries | 235 | almost 6 years ago | 0 | | 1 | | Python | DAWNBench: An End-to-End Deep Learning Benchmark and Competition |
| xternalz/WideResNet-pytorch | 206 | over 6 years ago | 0 | | 2 | mit | Python | Wide Residual Networks (WideResNets) in PyTorch |
| chenyaofo/pytorch-cifar-models | 174 | about 3 years ago | 0 | | 0 | bsd-3-clause | Python | Pretrained models on CIFAR10/100 in PyTorch |