Instructions to use HPLT/hplt_bert_base_is with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use HPLT/hplt_bert_base_is with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="HPLT/hplt_bert_base_is", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("HPLT/hplt_bert_base_is", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
Upload tokenizer.json with huggingface_hub
tokenizer.json CHANGED (+2 -2)

```diff
@@ -991,7 +991,7 @@
       {
         "type": "Metaspace",
         "replacement": "▁",
-        "
+        "prepend_scheme": "always"
       },
       {
         "type": "Split",
@@ -1104,7 +1104,7 @@
       {
         "type": "Metaspace",
         "replacement": "▁",
-        "
+        "prepend_scheme": "never"
       },
       {
         "type": "Strip",
```