| Repository | Stars | ? | ? | Last commit | Releases | Last release | Open issues | License | Language | Description |
|---|---|---|---|---|---|---|---|---|---|---|
| huggingface/transformers | 119,240 | 64 | 2,484 | about 2 years ago | 125 | November 15, 2023 | 946 | apache-2.0 | Python | 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. |
| JohnSnowLabs/spark-nlp | 3,578 | 0 | 30 | about 2 years ago | 134 | December 08, 2023 | 43 | apache-2.0 | Scala | State of the Art Natural Language Processing |
| CLUEbenchmark/CLUE | 3,345 | 0 | 0 | almost 3 years ago | 0 | | 73 | | Python | CLUE (Chinese Language Understanding Evaluation Benchmark): datasets, baselines, pre-trained models, corpus, and leaderboard |
| tensorflow/lingvo | 2,776 | 0 | 0 | about 2 years ago | 0 | | 133 | apache-2.0 | Python | Lingvo |
| NVIDIA/OpenSeq2Seq | 1,393 | 0 | 0 | almost 5 years ago | 0 | | 85 | apache-2.0 | Python | Toolkit for efficient experimentation with Speech Recognition, Text2Speech and NLP |
| ymcui/Chinese-ELECTRA | 1,253 | 0 | 0 | about 3 years ago | 0 | | 0 | apache-2.0 | Python | Pre-trained Chinese ELECTRA models |
| ashishpatel26/Treasure-of-Transformers | 541 | 0 | 0 | over 3 years ago | 0 | | 0 | mit | Jupyter Notebook | 💁 Awesome Treasure of Transformers Models for Natural Language Processing: papers, videos, blogs, and official repos, along with Colab notebooks. 🛫☑️ |
| ymcui/MacBERT | 489 | 0 | 0 | about 3 years ago | 0 | | 0 | apache-2.0 | | Revisiting Pre-trained Models for Chinese Natural Language Processing (MacBERT) |
| monologg/KoELECTRA | 448 | 0 | 0 | about 4 years ago | 0 | | 0 | apache-2.0 | Python | Pretrained ELECTRA Model for Korean |
| uzaymacar/attention-mechanisms | 294 | 0 | 0 | over 4 years ago | 0 | | 2 | mit | Python | Implementations of a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras. |
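Several of the repositories listed above (huggingface/transformers, ashishpatel26/Treasure-of-Transformers, uzaymacar/attention-mechanisms) are built around the same core operation: scaled dot-product attention. A minimal pure-Python sketch of that operation for a single query vector is below; it is illustrative only and is not taken from any of the listed repositories:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(query, keys, values):
    """Attend over `keys`/`values` with one `query` vector.

    scores_i = (query . keys[i]) / sqrt(d_k), weights = softmax(scores),
    output = sum_i weights[i] * values[i].
    Returns (output_vector, attention_weights).
    """
    d_k = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d_k)
              for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    output = [sum(w * v[j] for w, v in zip(weights, values))
              for j in range(dim)]
    return output, weights

# Example: the query matches the first key more closely, so the
# first value vector receives the larger attention weight.
out, w = scaled_dot_product_attention(
    query=[1.0, 0.0],
    keys=[[1.0, 0.0], [0.0, 1.0]],
    values=[[1.0, 2.0], [3.0, 4.0]],
)
```

The `1/sqrt(d_k)` scaling keeps dot products from growing with the key dimension, which would otherwise push the softmax into near-one-hot saturation; the listed libraries implement the same idea in batched, multi-head tensor form.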