Anyone Can Clone Your Voice Now

26 points · posted 6 hours ago
by bakigul

10 Comments

tefkah

4 hours ago

I struggle to find non-evil applications of voice cloning. Maybe listening to your dead relative's voice one more time? But those use cases seem so niche compared to the overwhelming uses this will likely have: misinformation, scamming, putting voice actors out of work.

apwheele

3 hours ago

I would clone my own and do things like create scripted tutorials/presentations and audio books.

I don't personally prefer it, but a non-trivial number of people like video/audio presentations over writing.

testing22321

25 minutes ago

I’m currently recording my books into audiobooks the old-fashioned way. I wonder how indistinguishable this would be.

c0balt

3 hours ago

Selling a voice profile for procedural/generated voice acting (similar to elevenlabs "voices") of a well-known person or a pleasant-sounding voice could be a legitimate use case. But only if actual consent is acquired first.

Given that rights to one's likeness (personality rights) are somewhat defined, there might be a legitimate use case here. For example, a user might prefer a TTS with the voice of a familiar presenter from TV over a generic voice.

But it sounds exceedingly easy to abuse (like other generative AI applications) in order to exploit end users (social engineering) and voice "providers" (exploitation of personality rights).

schlupfknoten

4 hours ago

Voice acting for procedurally generated games?

chistev

4 hours ago

Black mirror episode

pogue

5 hours ago

They couldn't already do that? Or is this new Qwen model just that much better?

tefkah

4 hours ago

It is significantly better.

pogue

4 hours ago

Is there a demo or a free way to try it out there, without having to pay even for a single test?