this post was submitted on 17 Aug 2023
12 points (100.0% liked)

Hi, as in the title above. Thanks for reading.

[–] duncesplayed@lemmy.one 9 points 1 year ago* (last edited 1 year ago) (2 children)

I don't think you'd want that website. Whisper is fairly efficient (even an old GTX can run at 4x-8x real-time speed), but a website like that would still require pretty expensive cloud GPUs. It's hard to imagine such a site not data mining you and selling all your audio to advertisers to pay off investors.

Better to buy a GPU and do it yourself. (Good news: it takes like 30 seconds to install)
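For anyone curious what "do it yourself" looks like in practice, here is a minimal sketch that builds the command line for the open-source `openai-whisper` CLI. The `--model` and `--device` flags are real options of that tool; the audio path and the helper function are illustrative assumptions, not anything from this thread.

```python
# Sketch: build the Whisper CLI invocation for a local, GPU-backed
# transcription run (assumes `pip install openai-whisper` has been done).
import shlex

def whisper_cmd(audio_path: str, model: str = "base", device: str = "cuda") -> str:
    """Return the shell command that transcribes `audio_path` locally."""
    return f"whisper {shlex.quote(audio_path)} --model {model} --device {device}"

print(whisper_cmd("audio.mp3"))
# Once openai-whisper is installed, run it with:
#   subprocess.run(shlex.split(whisper_cmd("audio.mp3")), check=True)
```

Using `shlex.quote` keeps paths with spaces safe when the command is eventually handed to a shell.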

[–] worfamerryman 4 points 1 year ago

I run it on fairly short audio files using an i5-3470.

I'm pretty sure it's faster than real time, but I rarely use it. Adobe has its own thing in beta, and it's a lot better.

It’s part of their podcast platform.

[–] user@lemmy.one 2 points 1 year ago

Thanks for the comment. I had no idea GPU processing existed. I found this https://github.com/Const-me/Whisper and it works perfectly. Hopefully there's nothing nasty on my Windows PC 🤞 Thanks 👍

[–] Unskilled5117@feddit.de 3 points 1 year ago* (last edited 1 year ago)

For iOS and macOS there is Hello Transcribe, a privacy-respecting app that uses Whisper.

[–] Paol@lemmy.ml 1 point 1 year ago (1 children)

Google Colab can run Whisper, but it can most likely also run on an old, cheap laptop that you have.

[–] zoe@lemm.ee 1 point 1 year ago

Second this. Use Colab, but prefix their commands with '!' since Colab runs them in a shell. Also pick the T4 as your runtime for floating-point support. Good luck. Running Whisper locally is very slow, especially without a dedicated GPU.
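The Colab workflow the commenter describes can be sketched as follows. The '!' prefix is genuinely how Colab notebooks pass a line to the shell; the package name and audio path here are assumptions for illustration.

```python
# Sketch: shell commands in a Colab code cell must be prefixed with '!'
# so the notebook executes them in a shell rather than as Python.
def as_colab_cell(command: str) -> str:
    """Prefix a shell command with '!' for use in a Colab code cell."""
    return command if command.startswith("!") else "!" + command

cells = [
    as_colab_cell("pip install -U openai-whisper"),           # install in the Colab VM
    as_colab_cell("whisper audio.mp3 --model base --device cuda"),  # T4 runtime -> CUDA
]
for cell in cells:
    print(cell)
```

Paste each resulting line into its own Colab cell after selecting a GPU runtime (Runtime → Change runtime type → T4).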

[–] ipsirc@lemmy.ml 1 point 1 year ago