• RightHandOfIkaros@lemmy.world · +1 · 6 hours ago

    I recently watched Last Samurai Standing on Netflix, I think, which I actually thought was pretty good, but I noticed the English dub seemed to use some kind of AI to make the characters' mouth movements match the English dub words.

    At first it wasn't too noticeable, and I thought they had maybe filmed in English. But there were a lot of things that suggested they had some native Japanese people on the team, because English speakers would either not have known about them or would have felt culturally compelled to change them. Then at some point I noticed it: the mouths moving unnaturally. I think it was a close-up of someone talking, but once I noticed it, it was pretty distracting.

    I honestly like the idea though. It might actually be a legitimate use case for "AI," since some people are distracted when character mouth movements don't match the words they say, but in this case I think it is still too early for deployment. It was done well enough to pass with people who can't see very well, but it was pretty distracting for me once I noticed it. It would be too much work to film all the actors speaking languages they likely don't understand just to composite the footage. Is it mandatory? No, but it would be nice for people who get distracted by mismatched audio.

  • BoastfulDaedra@lemmynsfw.com · +5 · 11 hours ago

    Honestly, the AI-driven subtitles are intolerable enough.

    The sooner this AI bubble crashes and burns the better. If this trillion-dollar corporation can't live up to the reputation it claims, then it needs to stop claiming that it's worth that much.

  • saejima@ani.social · +9 · edited · 18 hours ago

    So if I understand correctly, they fire professional voice actors and replace them with grotesque AI voice bots. I hope there will be more fandubbing in the future.

    • mindbleach@sh.itjust.works · +2 · 12 hours ago

      Yeah, maybe with some kind of filter so a few people can sound like a whole cast of distinct characters, without necessarily mimicking any particular human being.

      … wait.

  • Mercuri@ani.social · +40/−3 · 2 days ago

    The argument of “just watch subs” falls flat when they’re using AI to do soulless subtitling too.

    Jesus fuck I hate AI so much. The other day Google AI literally gaslit me. I remembered a joke I had heard in a movie but couldn’t remember the movie title. Google used to be really good at this kinda thing so I Googled it. The results were wrong and the AI was like “It’s Mike and Dave Need Wedding Dates”.

    Now, I’ve never seen Mike and Dave Need Wedding Dates so I knew that couldn’t be the movie so I told it “no that’s not it”. And what did the AI do? Did it say “oh, well then maybe it’s this?” Did it say it didn’t know? No, it doubled down insisting it MUST be Mike and Dave Need Wedding Dates.

    I COMMANDED it to stop suggesting it was Mike and Dave because it was definitely not it and it STILL kept saying that was the movie.

    So I asked ChatGPT. Yeah, I know it’s another AI but I was at my wits end here and Google was undeniably wrong. Anyway, ChatGPT figured it out instantly.

    So I went back to Google and was like, “ChatGPT figured it out. It was ‘The Art of Racing in the Rain’ you stupid AI.”

    Rather than admit it was wrong, Google then gaslit me saying I was misremembering the movie and that it was Mike and Dave Need Wedding Dates.

    I was furious so I told Google to prove it to me. Then it got all vague like, “well, I searched my database and web records.” I asked for the specific records. It gives me a link to the movie script for Mike and Dave Need Wedding Dates on some website. I do a simple word search and do not find the joke anywhere in the script. Nothing even remotely close.

    When I pointed this out to Google it was like, "well, maybe it was adlibbed or not in that early version of the script, but it's in discussions of the movie so I know I'm right."

    “Show me these discussions”

    “Sorry, there are no discussions”

    “So you hallucinated the whole thing and completely made it up?”

    “Yes”

    Fuck you, Google.

    • HubertManne@piefed.social · +2 · 21 hours ago

      While not quite as long an exchange, I've had the AI do this to me too. Though maybe you are misremembering and it just couldn't find it.

    • HarkMahlberg@kbin.earth · +18 · 2 days ago

      You know, 15 years ago it was a common pastime to see how quickly we could trick Cleverbot into admitting it was a chat bot and not actually intelligent. Now we have chat bots competing with us to prove that we must not be intelligent.

      And they’re winning.

    • Die4Ever@retrolemmy.com · +26 · 2 days ago

      So I went back to Google and was like, “ChatGPT figured it out. It was ‘The Art of Racing in the Rain’ you stupid AI.”

      lmao

  • molave@reddthat.com · +10 · 2 days ago

    On one hand: It's impossible to dub every single anime in a reasonable time.

    On the other: I totally see the big companies refusing to hire real actors for dubbing because ‘muh profits’

      • molave@reddthat.com · +1/−2 · 1 day ago

        There are ~25K anime on MyAnimeList. If we assume 250K episodes averaging 25 minutes each, we are talking about dubbing 6.25 million minutes overall.

        Let's say 80% of that is not yet dubbed; that's 5M minutes. Netflix delayed KaoruHana for at least two months to dub 8 episodes × 25 minutes, so they likely took three months to complete 200 minutes of footage, i.e. a rate of 0.3 anime minutes per VA minute. Let's say there are 100 organizations with three teams of dubbers each, so the rate scales to 90 anime minutes per VA minute overall.

        Clearing the backlog would require 55K VA minutes. That's 925 hours, or 115 business days, roughly half a year. I guess it is possible if we wait that long. And this calculation ignores the time to select the right VA for the job, the rates the VAs will ask to accept it, plus a whole lot of other factors I failed to mention or don't even know exist.

        And this is for just one language (English), and I’m still waiting for the Italian dub of JoJo Part 5.
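
The back-of-the-envelope math above can be laid out as a quick script. Every figure in it is an assumption from the comment, not real production data:

```python
# Sketch of the backlog estimate; all constants are assumed, not measured.
EPISODES = 250_000          # assumed episode count across ~25K MAL entries
MINUTES_PER_EPISODE = 25
UNDUBBED_SHARE = 0.8        # assumed fraction with no English dub yet
TEAM_RATE = 0.3             # assumed anime-minutes dubbed per VA-minute, per team
TEAMS = 100 * 3             # 100 organizations with 3 teams each

total_minutes = EPISODES * MINUTES_PER_EPISODE        # 6.25M anime minutes
backlog_minutes = total_minutes * UNDUBBED_SHARE      # 5M anime minutes
combined_rate = TEAM_RATE * TEAMS                     # 90 anime min per VA min
va_minutes = backlog_minutes / combined_rate          # ~55.6K VA minutes
hours = va_minutes / 60                               # ~926 hours
business_days = hours / 8                             # ~116 eight-hour days

print(round(hours), round(business_days))
```

Nudge the assumed TEAM_RATE or TEAMS and the backlog estimate moves linearly, which is the whole point of the exercise.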

        • _‌_反いじめ戦隊@ani.social · +3/−1 · 1 day ago

          If we assume

          bruv, every anime listed on MAL has runtimes, lengths, and additional metadata. Your overestimation is simply wrong, and the fact is there are few N1 localizers willing to work for coins on the dinar, for whatever AMZN is valued at. If you don't pay what we are worth, keep waiting on Italians to localize JoJo for you at N5 levels of knowledge.

          • KairuByte@lemmy.dbzer0.com · +4 · 1 day ago

            How is “every anime has one season of 10 episodes running 25 minutes each” an overestimation? I’d call that an underestimation. Yeah, there are anime with much shorter runtimes, but there are also many anime with well over 100 episodes.

            Not to mention, are you really expecting this random person to write an interface for their API to get exact numbers for an off the cuff online discussion?

            • _‌_反いじめ戦隊@ani.social · +3 · 1 day ago

              Because dub time doesn't equate to run time. Most anime have PLENTY of stretches with no voice work, and openings and endings can be copied and pasted.

              are you really expecting this random person to write an interface for their API to get exact numbers for an off the cuff online discussion?

              I fear you're not aware of where on the internet you are. This is basic 101 scripting work, the kind that happens every second your application notifies you that your anime got updated. Basic scripting is what our /c/ and the threadiverse do for most of us who actually moved on from fashit.
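
For anyone curious what that "basic 101 scripting" looks like, a tiny sketch that parses MAL-style duration strings and totals runtimes. The catalogue entries below are made-up examples, not pulled from MAL's actual API:

```python
import re

def duration_to_minutes(duration: str) -> int:
    """Parse a MAL-style duration string ('24 min per ep', '1 hr 55 min')
    into total minutes. Hypothetical helper, not real MAL tooling."""
    hours = re.search(r"(\d+)\s*hr", duration)
    mins = re.search(r"(\d+)\s*min", duration)
    total = 0
    if hours:
        total += int(hours.group(1)) * 60
    if mins:
        total += int(mins.group(1))
    return total

# Summing runtimes over a made-up catalogue of (episode count, duration) pairs:
catalogue = [(12, "24 min per ep"), (1, "1 hr 55 min"), (1100, "23 min per ep")]
total = sum(eps * duration_to_minutes(d) for eps, d in catalogue)
print(total)  # 12*24 + 115 + 1100*23 = 25703
```

The real version would loop over entries fetched from a MAL scraper or mirror instead of a hardcoded list, but the parsing step is the whole trick.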

          • molave@reddthat.com · +1/−3 · edited · 1 day ago

            I prefer proper VA dubbing over AI dubbing. This is more a question of whether the viewership prefers an AI dub to no dub at all when proper dubs are not yet available.

            bruv every anime listed in MAL has runtimes, lengths, and additional metadata.

            Fair; I don't have the time right now to be 100% accurate with my figures, so I went with a rough estimate. I tried to be as clear as I could on that point.

            there are few N1 localizers willing to work for coins on the dinar, for whatever AMZN is valued at

            If only it were easy for fandubs to be readily available. Just as there are few N1 localizers willing to work at Amazon's assessed rates, there are few IP holders willing to say yes to dubbing their shows for the money fandubbers can afford. There's also hiring the proper voice actors for the characters, and the ones who do the anime justice (and won't have the fans crucifying the anime for a craptastic dubbing job) deservedly ask a premium.

            • _‌_反いじめ戦隊@ani.social · +4 · 1 day ago

              viewership prefers AI dub vs. no dub

              According to the beta testers, and the internet, listeners abhorred the LLM localization and the tone-deaf synthetic speech dubbing. Keeping the original dubbing is simply what folks want, especially if the AI version is labeled abridged.

              [components of dubbing]

              At least you are aware why this /c/ prefers subs: they are that much cheaper and less error-prone to produce.

              • molave@reddthat.com · +1/−1 · 1 day ago

                According to the beta testers, and the internet, listeners abhorred the LLM localization and the tone-deaf synthetic speech dubbing. Keeping the original dubbing is simply what folks want, especially if the AI version is labeled abridged.

                Yes, in its current state. Will it stay that way? The tech companies are burning cash trying to make it not so. My hunch says even Vocaloid-tier AI dubbing will be enough for a large sector of the audience. Then the human vs. AI dubbing debate could become analogous to the debate between lossy (more accessible) and lossless (higher quality) audio.

                Now, LLM localization is the greater challenge. I highly doubt those, including the classic machine-learning models, can reach N1-level localization quality.

                • Unboxious@ani.social · +1 · 1 day ago

                  Now, LLM localization is the greater challenge. I highly doubt those, including the classic machine-learning models, can reach N1-level localization quality.

                  There’s no chance it’s happening any time soon. Many manga and anime lean heavily on visual context as well as the context of the story in general to clear up situations where the language would otherwise be ambiguous, so until the translation software can also use all of that context it’s basically impossible.

                • Susaga@sh.itjust.works · +1 · 1 day ago

                  It’s amusing to me how long people have been saying “yes, AI is crap, but it might not be crap some day, so just you wait!” Despite all the money tech companies have thrown at AI, it’s still as crap as it ever was, and I don’t see any reason to think it’ll get better.

                  Meanwhile, Crunchyroll doesn’t care if it’s crap, so long as they can get around the cost of paying humans (which is another can of worms). If they’re willing to buy this level of quality, what incentive is there for quality to improve?

                • _‌_反いじめ戦隊@ani.social · +3 · 1 day ago

                  The only funny thing about mentioning Vocaloid is that Vocaloid synthesis has to be manually pitched, tempo'd, and toned 🤣. Glad you honestly believe capitalists want to invest in anything more than tone-deaf, pitchless speech waveforms.

                  But please, never stop supporting espeak!

  • krooklochurm@lemmy.ca · +7/−2 · 2 days ago

    Oh my god I fucking love these dubs.

    It’s so fucking bad.

    I need more garbage like this in my life

    • mindbleach@sh.itjust.works · +2 · 1 day ago

      Well good news, it’ll probably run locally by next year.

      I’d love to be able to watch any show from any language on like a one-minute delay, with the robot faking the original voices. But I’d also like a fully in-character version of Revenge Of The Sith with the infamously bad Chinese subtitles.

  • xia@lemmy.ca · +2/−4 · 1 day ago

    I can't seem to watch the example ATM, but I seem to be allergic to subs, so there are MANY shows for which I would love to have even trash-quality dubs.

      • xia@lemmy.ca · +1 · 1 day ago

        Honestly… given the reaction seen here, I was expecting FAR worse. Assuming there is approximately zero human input or effort, I actually find it remarkable for a push-button auto-dubbing solution. In fact, if I had such an auto-dubbing "button", I would surely make heavy use of it, and maybe even pay extra for it as a service… but that might just be me.

          • xia@lemmy.ca · +1/−1 · 21 hours ago

            I understand that it's weird, and it sucks that they would lift this above curated human dubs (I guess those carry separate ongoing license fees?), but yeah… as best I can tell it would be watchable, which for me is better than subs-only. Though I don't know what it might do to my enjoyment (or brain) after a whole episode or season of ingesting this. :)

            I think where I land on this issue is that it ought to be a third pole. We have subs, dubs, and this… and IMO it is misleading to call it a dub or to mix it in with wittingly-dubbed content, but I certainly see value in it.

            Secondary effects may be another matter. Part of me is afraid of a "race to the bottom" effect, where we end up in a position where nobody pays for high-quality dubs at all. But in theory, lowering the barrier to dubbing does not necessarily bring everything down to trash level: if practically overnight everyone and their dog can auto-dub anime, there is no reason to have sub-only anime, pro dubs become a standout feature, and watchful streaming services could even treat auto-dubs with high watch counts as a demand signal to pay for professional dubs.