How can this SmolVLA model be exported as an ONNX model?
#10
opened by Tank-123
I'm trying to convert this SmolVLA model to ONNX, and hopefully quantize it to int8 so it can run on CPU. Has anyone successfully tried this?
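For reference, a generic PyTorch-to-ONNX export plus dynamic int8 quantization pipeline looks roughly like the sketch below. This is not the SmolVLA-specific path: `DummyPolicy`, the input shapes, and the opset are placeholder assumptions, since the real policy takes richer observations (images, language, proprioception) and would need a thin wrapper exposing a tensor-only forward before `torch.onnx.export` can trace it.

```python
# Hedged sketch: export a PyTorch policy to ONNX, then quantize to int8 for CPU.
# DummyPolicy and the dummy input shapes are stand-ins, not the SmolVLA API.
import torch
from onnxruntime.quantization import quantize_dynamic, QuantType


class DummyPolicy(torch.nn.Module):
    """Tensor-in / tensor-out stand-in for the real (wrapped) SmolVLA policy."""

    def __init__(self, state_dim: int = 6, action_dim: int = 6):
        super().__init__()
        self.backbone = torch.nn.Sequential(
            torch.nn.Conv2d(3, 8, kernel_size=3, stride=2),
            torch.nn.ReLU(),
            torch.nn.AdaptiveAvgPool2d(1),
        )
        self.head = torch.nn.Linear(8 + state_dim, action_dim)

    def forward(self, image: torch.Tensor, state: torch.Tensor) -> torch.Tensor:
        feats = self.backbone(image).flatten(1)              # (B, 8)
        return self.head(torch.cat([feats, state], dim=1))   # (B, action_dim)


# Replace DummyPolicy with the loaded, fine-tuned SmolVLA policy wrapped so that
# its forward takes and returns plain tensors.
policy = DummyPolicy().eval()

dummy_image = torch.randn(1, 3, 256, 256)  # assumed camera resolution
dummy_state = torch.randn(1, 6)            # assumed proprioceptive state size

torch.onnx.export(
    policy,
    (dummy_image, dummy_state),
    "smolvla.onnx",
    input_names=["image", "state"],
    output_names=["action"],
    opset_version=17,
)

# Post-training dynamic quantization (weights to int8), aimed at CPU inference.
quantize_dynamic("smolvla.onnx", "smolvla_int8.onnx", weight_type=QuantType.QInt8)
```

The resulting `smolvla_int8.onnx` can then be loaded with `onnxruntime.InferenceSession` on CPU; whether the full SmolVLA graph exports cleanly (and how much accuracy dynamic quantization costs) still needs to be verified on the actual model.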
Tank-123 changed the discussion title from "How can this smolvla model be exported as onnx model ?" to "How can this SmolVLA model be exported as an ONNX model?"
Tank-123 changed the discussion status to closed.
I have problems accessing the model through the Python file path in the notebook. Any suggestions on how to solve this issue?
To clarify, the problem was not with the installation; it was with the fine-tuning block.