
A merge of https://huggingface.co/upstage/llama-65b-instruct (donor) into https://huggingface.co/Gryphe/MythoMax-L2-13b (primary model), produced with frankenllama_22b.py from https://huggingface.co/chargoddard/llama2-22b.

The resulting model has 32.905B parameters, stored as F16 safetensors.
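A frankenllama-style merge splices transformer layers from the donor stack into the primary model's stack. The toy sketch below only illustrates the layer-splicing idea with labeled placeholders; the insertion point and donor slice chosen here are assumptions for illustration, not the actual recipe used by frankenllama_22b.py (which must also reconcile the models' different hidden sizes).

```python
# Toy sketch of a frankenllama-style layer splice.
# Layer counts match the base models (LLaMA-2 13B has 40 decoder
# layers, LLaMA 65B has 80), but the insertion point and donor
# slice below are illustrative assumptions, NOT the exact recipe
# in frankenllama_22b.py.

PRIMARY_LAYERS = [f"primary.{i}" for i in range(40)]  # MythoMax-L2-13b
DONOR_LAYERS = [f"donor.{i}" for i in range(80)]      # llama-65b-instruct


def splice(primary, donor, insert_at, donor_slice):
    """Insert a contiguous slice of donor layers into the primary stack."""
    start, stop = donor_slice
    return primary[:insert_at] + donor[start:stop] + primary[insert_at:]


# Illustrative only: graft 16 mid-stack donor layers after primary layer 20.
merged = splice(PRIMARY_LAYERS, DONOR_LAYERS, insert_at=20, donor_slice=(32, 48))
print(len(merged))  # 56 layers in this toy merged stack
```

In the real script the donor layers cannot be copied verbatim, since the 65B model's hidden dimension differs from the 13B model's; the weights are adapted to fit, which is why the final parameter count (32.905B) is not a simple sum of slices.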
