| Repository | Stars | Last commit | Last release | License | Language | Description |
| --- | --- | --- | --- | --- | --- | --- |
| huaban/elasticsearch-analysis-jieba | 296 | about 9 years ago | | | Java | Elasticsearch plugin providing the `jieba` analyzer, tokenizer, and token filter, with two modes: `index`, used when indexing documents, and `search`, used when running queries. |
| howl-anderson/MicroTokenizer | 119 | over 4 years ago | October 18, 2024 | MIT | Python | A micro tokenizer for Chinese, covering a comprehensive set of segmentation algorithms. |
| DCjanus/cang-jie | 65 | over 2 years ago | November 04, 2023 | MIT | Rust | Chinese tokenizer for tantivy, based on jieba-rs. |
| frankee/sphinx-jieba | 18 | almost 9 years ago | | GPL-2.0 | C++ | Sphinx search engine with the jieba tokenizer. |
| fantasy/cppjieba-py | 5 | over 8 years ago | | MIT | Python | Python extension for cppjieba. |
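To illustrate the two-mode setup described for `elasticsearch-analysis-jieba` above, a mapping can declare one analyzer for indexing and another for querying. This is a minimal sketch, assuming the plugin registers analyzers named `jieba_index` and `jieba_search` (the index name `articles` and field name `content` are also illustrative, not taken from the plugin's documentation):

```shell
# Hedged sketch: create an index whose "content" field is segmented with
# jieba's index mode at write time and search mode at query time.
# The analyzer names jieba_index / jieba_search are assumptions about
# what the plugin registers; verify against the plugin's README.
curl -X PUT "localhost:9200/articles" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "properties": {
      "content": {
        "type": "text",
        "analyzer": "jieba_index",
        "search_analyzer": "jieba_search"
      }
    }
  }
}'
```

Splitting the two modes this way is the usual reason an analysis plugin ships both: index mode tends to emit finer-grained tokens for better recall, while search mode segments the query the way a user would phrase it.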