| Repository | Stars | Issues | Last updated | License | Language | Description |
| --- | --- | --- | --- | --- | --- | --- |
| koomri/text-segmentation | 73 | 3 | over 6 years ago | | Python | Implementation of the paper "Text Segmentation as a Supervised Learning Task" |
| google-research-datasets/wiki-split | 72 | 2 | almost 7 years ago | | | One million English sentences, each split into two sentences that together preserve the original meaning, extracted from Wikipedia edits |
| songjiang0909/awesome-knowledge-graph-construction | 69 | 0 | over 4 years ago | MIT | | |
| yairf11/MUPPET | 29 | 1 | almost 6 years ago | Other | Python | Code for the paper "Multi-Hop Paragraph Retrieval for Open-Domain Question Answering" |
| wpoa/OA-signalling | 19 | 80 | about 9 years ago | GPL-3.0 | Jupyter Notebook | A project to coordinate implementing a system to signal whether references cited on Wikipedia are free to reuse |
| malteos/semantic-document-relations | 18 | 0 | about 5 years ago | MIT | Python | Implementation, trained models, and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles" |
| jbcdnr/gretel-path-extrapolation | 15 | 0 | almost 6 years ago | Apache-2.0 | Python | Implementation for the paper "Extrapolating Paths with Graph Neural Networks" |
| EthanZhu90/ZSL_PP_CVPR17 | 12 | 0 | over 6 years ago | MIT | MATLAB | Code for the CVPR'17 paper "Zero-Shot Learning from Noisy Text Description at Part Precision" |
| mirrys/citation-needed-paper | 9 | 5 | about 6 years ago | | Python | Data and code for the models described in the paper "Citation Needed: A Taxonomy and Algorithmic Assessment of Wikipedia's Verifiability" |
| jacopofar/wikipedia-category-graph | 9 | 0 | over 11 years ago | Apache-2.0 | Java | An implementation of the paper "Automatically Assigning Wikipedia Articles to Macro-Categories" |