Instructions for using deepset/gbert-base-germandpr-question_encoder with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use deepset/gbert-base-germandpr-question_encoder with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="deepset/gbert-base-germandpr-question_encoder")

# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("deepset/gbert-base-germandpr-question_encoder")
model = AutoModel.from_pretrained("deepset/gbert-base-germandpr-question_encoder")
```
- Inference Providers
- Notebooks
- Google Colab
- Kaggle
Commit History
Update README.md dcbc13b
Update README.md ccfc79c
Update README.md 1302d0a
Update README.md c39858a
Add question encoder model e548b7c
Bogdan Kostić committed on