# whisper-ro
This model (readerbench/whisper-ro) is a fine-tuned version of openai/whisper-small for Romanian automatic speech recognition, trained on the Echo dataset, a large open-source Romanian speech corpus.
Word error rates (WER, %) on Romanian evaluation sets:

| Dataset | Small | Large-v2 | Fine-tuned small (this model) |
|---|---|---|---|
| Common Voice | 33.2 | 15.8 | 12.2 |
| FLEURS | 29.8 | 14.4 | 10.9 |
| VoxPopuli | 28.6 | 14.4 | 9.4 |
| Echo | >100 | >100 | 8.6 |
| RSC | 38.6 | 28.5 | 5.4 |
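The scores above are word error rates: the word-level edit distance between the model's transcript and the reference, divided by the number of reference words. A minimal sketch in plain Python (illustrative only, not the scorer used to produce the table; the `wer` helper is hypothetical):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance divided by
    the number of reference words (illustrative sketch)."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = minimum edits to turn ref[:i] into hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)
```

Because insertions are counted, WER can exceed 100% when the hypothesis is much longer than the reference, which is how the base Whisper models score ">100" on Echo.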
## Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 256
- total_eval_batch_size: 256
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20.0
- mixed_precision_training: Native AMP
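The linear scheduler listed above warms the learning rate up from 0 to the peak value over 500 steps, then decays it linearly back to 0 over the remaining steps. A minimal sketch (the function name and the total-step count in the usage note are illustrative, not taken from the training code):

```python
def linear_lr(step: int, total_steps: int,
              peak_lr: float = 1e-5, warmup_steps: int = 500) -> float:
    """Linear warmup to peak_lr over warmup_steps, then linear
    decay to 0 at total_steps (sketch of the 'linear' schedule)."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    # Clamp at 0 once training passes total_steps.
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

For example, with a hypothetical 10,000-step run, the rate peaks at 1e-05 at step 500 and reaches 0 at step 10,000.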
Model: readerbench/whisper-ro (fine-tuned from openai/whisper-small)

Training dataset: Echo
## Evaluation results

- WER on Echo (self-reported): 0.087