Instructions to use BenjaminOcampo/task-implicit_task__model-hatebert__aug_method-ra with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use BenjaminOcampo/task-implicit_task__model-hatebert__aug_method-ra with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="BenjaminOcampo/task-implicit_task__model-hatebert__aug_method-ra")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("BenjaminOcampo/task-implicit_task__model-hatebert__aug_method-ra")
model = AutoModelForSequenceClassification.from_pretrained("BenjaminOcampo/task-implicit_task__model-hatebert__aug_method-ra")
```
- Notebooks
- Google Colab
- Kaggle
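When loading the model directly (rather than through the pipeline, which handles this for you), `AutoModelForSequenceClassification` returns raw logits that must be converted to probabilities with a softmax. A minimal pure-Python sketch of that step, using hypothetical logits — the two-label layout is an assumption, so check the model's `config.id2label` for the actual label names:

```python
import math

def softmax(logits):
    """Convert raw classifier logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one input with two labels, as a
# sequence-classification head would produce them.
logits = [-1.2, 2.3]
probs = softmax(logits)
pred = probs.index(max(probs))  # index of the most likely label
```

In practice you would take `logits` from `model(**tokenizer(text, return_tensors="pt")).logits` and map `pred` through the model config's label names.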