readme : add docker instructions (#2711)
I found the Docker instructions in the README.md useful, along with the notes on the differences between the Docker variants, such as ffmpeg and CUDA support. However, this section was removed in v1.7.4, and I would vote to bring it back.
This is a pull request to add that section back.
README.md CHANGED

@@ -360,6 +360,38 @@ Run the inference examples as usual, for example:

Unchanged context before the insertion point:

- If you have trouble with Ascend NPU device, please create an issue with **[CANN]** prefix/tag.
- If you run successfully with your Ascend NPU device, please help update the table `Verified devices`.

The restored section:
## Docker

### Prerequisites

- Docker must be installed and running on your system.
- Create a folder to store big models & intermediate files (ex. /whisper/models); see the sketch after this list.
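For example, the models folder from the prerequisites can be created once up front. The path below is just the README's example; any writable host directory that you later bind-mount into the container works the same way:

```shell
# one-time setup: a host folder for models and intermediate files
mkdir -p /whisper/models
```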
### Images

We have two Docker images available for this project:

1. `ghcr.io/ggerganov/whisper.cpp:main`: This image includes the main executable file as well as `curl` and `ffmpeg`. (platforms: `linux/amd64`, `linux/arm64`)
2. `ghcr.io/ggerganov/whisper.cpp:main-cuda`: Same as `main` but compiled with CUDA support. (platforms: `linux/amd64`)
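A sketch of how the two variants are used in practice. Pulling explicitly is optional (`docker run` fetches an image on first use), and running the CUDA image assumes the NVIDIA Container Toolkit is installed on the host so the container can be granted GPU access with `--gpus all`:

```shell
# pull the images explicitly (docker run would also fetch them on first use)
docker pull ghcr.io/ggerganov/whisper.cpp:main
docker pull ghcr.io/ggerganov/whisper.cpp:main-cuda

# the CUDA variant needs GPU access, e.g. via the NVIDIA Container Toolkit
docker run -it --rm --gpus all \
  -v /whisper/models:/models \
  ghcr.io/ggerganov/whisper.cpp:main-cuda "./main -m /models/ggml-base.bin -f ./samples/jfk.wav"
```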
### Usage

```shell
# download model and persist it in a local folder
docker run -it --rm \
  -v path/to/models:/models \
  whisper.cpp:main "./models/download-ggml-model.sh base /models"
# transcribe an audio file
docker run -it --rm \
  -v path/to/models:/models \
  -v path/to/audios:/audios \
  whisper.cpp:main "./main -m /models/ggml-base.bin -f /audios/jfk.wav"
# transcribe an audio file in samples folder
docker run -it --rm \
  -v path/to/models:/models \
  whisper.cpp:main "./main -m /models/ggml-base.bin -f ./samples/jfk.wav"
```
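Because the `main` image bundles `ffmpeg`, audio that is not already in the 16-bit, 16-kHz WAV format that the `main` example expects can be converted inside the same container first. A minimal sketch, assuming the image entrypoint runs the quoted string through a shell (as the examples above suggest) and using a hypothetical input file `podcast.mp3`:

```shell
# convert to 16-kHz, 16-bit mono WAV, then transcribe the result
docker run -it --rm \
  -v path/to/audios:/audios \
  whisper.cpp:main "ffmpeg -i /audios/podcast.mp3 -ar 16000 -ac 1 -c:a pcm_s16le /audios/podcast.wav"
docker run -it --rm \
  -v path/to/models:/models \
  -v path/to/audios:/audios \
  whisper.cpp:main "./main -m /models/ggml-base.bin -f /audios/podcast.wav"
```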
Unchanged context after the restored section:

## Installing with Conan

You can install pre-built binaries for whisper.cpp or build it from source using [Conan](https://conan.io/). Use the following command: