Instructions for using rmd26/Coder with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Adapters

How to use rmd26/Coder with Adapters:

```python
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("undefined")
model.load_adapter("rmd26/Coder", set_active=True)
```

- Notebooks
  - Google Colab
  - Kaggle
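The Adapters snippet above only loads the adapter weights. A minimal end-to-end sketch might look like the following; note that the prompt template and the helper names (`build_prompt`, `load_coder`) are assumptions for illustration, not something this card documents, and the base checkpoint should be one of the `base_model` entries listed in the metadata below.

```python
# Sketch of using rmd26/Coder as an adapter. The prompt template and
# function names here are hypothetical, not documented by the model card.

def build_prompt(instruction: str) -> str:
    """Wrap a coding instruction in a minimal instruction/response
    template (assumed format; adjust to the base model's chat template)."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

def load_coder(base_checkpoint: str):
    """Attach the rmd26/Coder adapter to a base checkpoint.

    `base_checkpoint` should be one of the `base_model` entries from the
    card metadata. Requires the third-party package: pip install adapters
    """
    from adapters import AutoAdapterModel

    model = AutoAdapterModel.from_pretrained(base_checkpoint)
    model.load_adapter("rmd26/Coder", set_active=True)
    return model
```

With a model returned by `load_coder`, generation would follow the usual `transformers` pattern: tokenize `build_prompt(...)`, call `model.generate(...)`, and decode the output with the base model's tokenizer.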
metadata

```yaml
license: apache-2.0
datasets:
  - fka/awesome-chatgpt-prompts
  - bingbangboom/fka-awesome-chatgpt-prompts-hindi
  - HuggingFaceFW/fineweb-2
  - hackaprompt/Pliny_HackAPrompt_Dataset
  - nvidia/OpenScience
  - nvidia/AceReason-1.1-SFT
language:
  - en
  - sk
  - cs
  - hu
  - pl
  - de
  - es
metrics:
  - bertscore
  - character
  - code_eval
  - accuracy
  - bleu
  - cer
  - charcut_mt
  - chrf
  - bleurt
base_model:
  - tencent/Hunyuan-A13B-Instruct
  - google/gemma-3n-E4B-it
  - google/gemma-3n-E4B-it-litert-preview
  - black-forest-labs/FLUX.1-Kontext-dev
  - google/magenta-realtime
  - Menlo/Jan-nano
  - nanonets/Nanonets-OCR-s
  - THUDM/GLM-4.1V-9B-Thinking
  - tencent/Hunyuan3D-2.1
  - moonshotai/Kimi-K2-Instruct
new_version: black-forest-labs/FLUX.1-Kontext-dev
library_name: adapter-transformers
```