README.md exists but is empty.
- Downloads last month: 49
Quantized
Inference Providers
This model is not deployed by any Inference Provider.
Model tree for Mediform/canary-1b-v2-mlx-q8
- Base model: nvidia/canary-1b-v2