Instructions for using Mattimax/DATA-AI_Chat_4_0.6B with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Mattimax/DATA-AI_Chat_4_0.6B with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Mattimax/DATA-AI_Chat_4_0.6B")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Mattimax/DATA-AI_Chat_4_0.6B")
model = AutoModelForCausalLM.from_pretrained("Mattimax/DATA-AI_Chat_4_0.6B")
messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
```

- Notebooks
- Google Colab
- Kaggle
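The `messages` list in the Transformers examples above uses the standard role/content chat format; a multi-turn conversation is built by appending each exchange to the history. A minimal sketch (pure Python, no model download needed; `add_exchange` is a hypothetical helper added here for illustration):

```python
# Chat history in the role/content format accepted by both
# pipeline(...) and tokenizer.apply_chat_template(...).
messages = [{"role": "user", "content": "Who are you?"}]

def add_exchange(history, assistant_reply, next_user_msg):
    """Record the assistant's reply, then queue the next user turn."""
    history.append({"role": "assistant", "content": assistant_reply})
    history.append({"role": "user", "content": next_user_msg})
    return history

add_exchange(messages, "I am DATA-AI Chat.", "What can you do?")
print(len(messages))         # 3
print(messages[-1]["role"])  # user
```

Passing the grown `messages` list back into `pipe(...)` or `apply_chat_template(...)` gives the model the full conversation context on each turn.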
- Local Apps
- vLLM
How to use Mattimax/DATA-AI_Chat_4_0.6B with vLLM:
Install from pip and serve the model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "Mattimax/DATA-AI_Chat_4_0.6B"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Mattimax/DATA-AI_Chat_4_0.6B",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
```

Use Docker
```shell
docker model run hf.co/Mattimax/DATA-AI_Chat_4_0.6B
```
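The curl call above returns a JSON body in the OpenAI chat-completions schema, with the reply text under `choices[0].message.content`. A minimal parsing sketch (the response body below is illustrative, not real model output):

```python
import json

# Illustrative response in the OpenAI chat-completions schema;
# an actual server reply will differ in content and ids.
raw = """
{
  "object": "chat.completion",
  "model": "Mattimax/DATA-AI_Chat_4_0.6B",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "The capital of France is Paris."},
      "finish_reason": "stop"
    }
  ]
}
"""
response = json.loads(raw)
reply = response["choices"][0]["message"]["content"]
print(reply)  # The capital of France is Paris.
```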
- SGLang
How to use Mattimax/DATA-AI_Chat_4_0.6B with SGLang:
Install from pip and serve the model
```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "Mattimax/DATA-AI_Chat_4_0.6B" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Mattimax/DATA-AI_Chat_4_0.6B",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
```

Use Docker images
```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "Mattimax/DATA-AI_Chat_4_0.6B" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Mattimax/DATA-AI_Chat_4_0.6B",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
```

- Docker Model Runner
How to use Mattimax/DATA-AI_Chat_4_0.6B with Docker Model Runner:
```shell
docker model run hf.co/Mattimax/DATA-AI_Chat_4_0.6B
```
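The vLLM and SGLang servers above expose the same OpenAI-compatible chat-completions API, so the request body is identical across them. A minimal sketch of building that body in Python (construction and serialization only; actually sending it requires one of the servers to be running):

```python
import json

# Build the OpenAI-compatible request body used in the curl examples above.
payload = {
    "model": "Mattimax/DATA-AI_Chat_4_0.6B",
    "messages": [
        {"role": "user", "content": "What is the capital of France?"},
    ],
}
body = json.dumps(payload)

# POST `body` to http://localhost:8000/v1/chat/completions (vLLM) or
# http://localhost:30000/v1/chat/completions (SGLang), with the
# header Content-Type: application/json.
print(json.loads(body)["messages"][0]["role"])  # user
```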
🚀 DATA-AI_Chat_4_0.6B – Model Card
✅ Production-Grade Model – Ready for Real Applications

This model is a stable evolution of the DATA-AI Chat series, featuring improved language abilities and better overall performance.
📖 Model Description

Mattimax/DATA-AI_Chat_4_0.6B is a general-purpose language model developed by M.INC., designed for lightweight AI applications such as conversational agents, automated assistance, and AI prototypes in light production environments.
It builds on lessons from the earlier Thinking_0.5B prototype, scaling up to 0.6 billion parameters and integrating advanced instruction tuning and human-feedback methods.
- Performance: enhanced and smoother
- Stability: high
- Recommended for: real-world usage, AI assistance, light automation
🎯 Purpose

Deliver a compact yet effective language model suited to real-world scenarios that require good contextual understanding with limited resources.
🗣️ Supported Languages
- 🇮🇹 Italian
- 🇬🇧 English
- 🇪🇸 Spanish
- 🇫🇷 French
- ➕ Full or partial support for 30+ other languages
🏗️ Origin
- Base model: proprietary architecture evolved from DATA-AI_Chat_4_Thinking_0.5B
- Developed by: Mattimax (M. Marzorati)
- Organization: M.INC.
- Related projects: DATANET – a next-generation conversational AI platform
📌 Model Status

| Feature | Status |
|---|---|
| Type | Stable |
| Performance | High |
| Stability | Guaranteed |
| Usage | Production-ready |
| License | Apache 2.0 |
📥 Download
📦 Mattimax/DATA-AI_Chat_4_0.6B on Hugging Face
📝 Final Notes

With this version, the DATA-AI family reaches a new level of maturity: a balance between lightness and contextual intelligence, ready to be integrated into real systems.