Poor deaf kids. Not because they’re being held captive but because they’re relying on shitty automatic captions.
For example, Czech was only added very recently, and the captions really suck: they change the meaning of most sentences and even include spelling errors.
Everyone making scripted videos should at least:
- go through their script to convert it into a transcript (match what’s actually been said – looking at you, CGP Grey – and remove visual cues)
- upload it for YouTube’s auto-timing (which isn’t perfect, but we’ll take it)
Too bad the FCC’s captioning act is toothless: even TV stations (like HBO) uploading their content to YouTube don’t bother importing captions, even though they’re legally required to.
it’s crazy to me that for all the ai “advances” in the past few years nobody has thought to improve subtitling.
Andrew Ng did a video where he gradually added noise to the training audio to improve the model’s quality.
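The idea behind that kind of noise augmentation can be sketched in a few lines (this is a generic illustration, not the pipeline from the video — the sine tone stands in for a real speech clip, and the SNR schedule is made up):

```python
import math
import random

def mix_noise(signal, snr_db):
    """Mix white Gaussian noise into a signal at a target SNR (in dB)."""
    sig_power = sum(s * s for s in signal) / len(signal)
    noise_amp = math.sqrt(sig_power / (10 ** (snr_db / 10)))
    return [s + random.gauss(0.0, noise_amp) for s in signal]

# One second of a 440 Hz tone at 16 kHz stands in for a clean speech clip.
clean = [math.sin(2 * math.pi * 440 * t / 16000) for t in range(16000)]

# Curriculum: progressively noisier copies of the same "utterance",
# so the model learns to transcribe under worsening conditions.
augmented = [mix_noise(clean, snr) for snr in (30, 20, 10, 5)]
```

Training on the noisy copies alongside the clean original is what makes the recognizer robust to real-world audio.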
But here we’re dealing with homophones, so it’s not just turning speech into text; it also needs to be context-aware.
Possible but too expensive to implement automatically.
context awareness is the entire point of language models tho :(
I’m highlighting that speech to text and context awareness are different skills.
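A toy sketch of that distinction (the corpus counts below are invented): the acoustic model hears identical audio for both homophones, and only a context model over the surrounding words can break the tie.

```python
from collections import Counter

# Hypothetical bigram counts from some text corpus (made-up numbers).
bigrams = Counter({
    ("over", "there"): 50,
    ("over", "their"): 2,
    ("lost", "their"): 40,
    ("lost", "there"): 1,
})

def pick(prev_word, candidates):
    """Choose the homophone most likely to follow prev_word."""
    return max(candidates, key=lambda w: bigrams[(prev_word, w)])

print(pick("over", ["there", "their"]))   # there
print(pick("lost", ["there", "their"]))   # their
```

A real language model does the same thing with far more context, which is exactly the extra work beyond raw speech-to-text.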
YouTube is unlikely to waste loads of compute power on subtitles that don’t need it just to capture the occasional edge case.
i mean, it’s a one-time-per-video thing. they already do tons of processing on every upload.
So if you can reduce compute there, you save money.
There is no technical difficulty. It’s a business decision.
right now they’re dynamically generating subtitles every time. that’s way more compute.
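If that guess holds, a back-of-envelope comparison shows why a one-time transcription wins (every number here is an invented assumption, not YouTube data):

```python
# Arbitrary cost units; both figures are assumptions for illustration only.
TRANSCRIBE_ONCE = 1.0   # assumed cost to transcribe one video at upload
PER_VIEW = 0.05         # assumed cost to re-generate captions per view

views = 10_000
one_time = TRANSCRIBE_ONCE
regenerate = PER_VIEW * views

print(regenerate / one_time)   # 500.0 — regeneration dominates quickly
```

Whatever the real per-view cost is, it only has to be nonzero for the one-time roll to win at YouTube-scale view counts.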
For real? That’s incredibly dumb/expensive compared to one subtitle roll. Can you share where you saw that?
well, i have no evidence of this. however, looking at the way auto-generated subtitles are served on youtube right now: they’re sent individually, word by word, from the server; they pick up filler words like “uh”; and they sometimes pause for several seconds in the middle of sentences. they’re also not sent over a websocket, which means multiple requests over the course of a video. more requests means the server works harder, because it can’t just stream the text the way it streams the video. and the only reason they’d do that, other than incompetence (which would surely have been corrected by now – it’s been like this for years), is if the web backend has to wait for the next word to be generated.
i would love to actually know what’s going on if anyone has any insight.
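For contrast, a pre-generated “subtitle roll” is just a static file that can be served in one request. Here’s a minimal parser for a simplified WebVTT-style roll (this illustrates the format, not YouTube’s actual delivery path):

```python
# A tiny hand-written caption roll in (simplified) WebVTT format.
SAMPLE = """WEBVTT

00:00:01.000 --> 00:00:03.500
hello and welcome back

00:00:03.500 --> 00:00:06.000
today we're talking about captions
"""

def parse_cues(vtt):
    """Return a list of (start, end, text) cues from a simple VTT string."""
    cues = []
    blocks = vtt.strip().split("\n\n")
    for block in blocks[1:]:            # skip the WEBVTT header block
        timing, *text = block.splitlines()
        start, _, end = timing.partition(" --> ")
        cues.append((start, end, " ".join(text)))
    return cues

for start, end, text in parse_cues(SAMPLE):
    print(start, end, text)
```

Once a file like this exists, the client fetches it once and displays cues on its own clock — no per-word round trips needed.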
it would be an improvement. that’s not what we’re doing anymore
new tech is there to make everyone more miserable
It’s even worse for captions.
Captions and subtitles aren’t even the same thing.
In fact, most DVD players don’t even pass closed captioning through their HDMI ports, so old captioned DVDs don’t work anymore.
i too watched that technology connections video
explain.

edited with explanation: i’ve seen the technology connections video, thanks. my comment is still about the actual post above, and i was specifically thinking about auto-generated subs rather than, say, movies. apparently that’s not obvious.
My student friend tells me that the auto-generated captions for non-English MS Teams lecture recordings have recently improved significantly and have even become usable.
not for my language. they are hilariously bad.
I’m sure they have… they just aren’t currently incentivized to do so
Probably because a lot of countries either dub the content or it’s already in their native language. You generally see a lot of subtitles on OpenSubtitles from countries like the Netherlands, where that doesn’t happen.
on youtube and twitter?!
Auto-generated subtitles don’t sell ads and don’t acquire personal data.
No, but in the fields where there’s money to be made from subtitles, like movies and TV shows.
what does that have to do with the OP though?
Where do you think the money for developing new shit comes from? From places where money is made, like TV, movies, etc. YouTube doesn’t really make more money from better subtitles, because the ads are there either way.
And Twitter hahaha I can only laugh at that
…i feel like you’re having a different conversation than i am.
That’s what OP said; I just responded with the reason why that isn’t being done, aka money.
that’s me. op is the big image up top.