VoiceBox, Part of Wolfestone Group

Sophie Muller on BBC Radio

Sophie Muller, one of our multimedia specialists here at VoiceBox, appeared as the expert on BBC Radio CWR’s lunchtime show, discussing all things subtitles – from the skill of stenographers to the accessibility of AI- versus human-powered captions.

According to recent reports, around 18% of the UK population currently use subtitles, including the deaf and hard-of-hearing and an increasing number of younger audiences. Sophie argues that people who rely on subtitles deserve the highest level of accuracy and quality possible.

You can listen to the subtitle discussion with Brody Swain on BBC Sounds.

If you are unable to listen, no worries! We have transcribed VoiceBox’s contribution below:

Brody Swain: Sophie Muller, a multimedia specialist at VoiceBox, a leading subtitles agency. 

Sophie joins me on the show today. Hello Sophie, you okay?

Sophie Muller: Hi Brody! How are you doing?

B: Yeah, very good! So, I’m going to cut to the chase, how do they (subtitles) work?

Because it’s been suggested that there’s somebody sat there tapping away, or that it is artificial intelligence – Sophie, tell me more?

S: It is a mixture of both! AI is being introduced more and more as we move into a more technical world, but there are also human captioners. People like me who sit behind the screen and tap away – that’s very normal.

I think it depends on your content and what you’re using it for.

We use a human being to capture accuracy. Human beings pick up accents, they pick up colloquialisms, they pick up localised phrases; AI can struggle with this.

For example, an accent from Glasgow, AI can struggle with that. Human beings have a little bit more time and, you know, the human mind is a little bit different – (it is) more prepared for things like that. We can’t quite get computers up to speed, but AI is used.

But is it accessible for people relying on subtitles? No, it isn’t, but it is sometimes a last-minute choice that is often cheaper. (Again), it is less accessible and not the friendliest for people who rely on subtitles. Sadly, some companies just tick a box and use AI, but it is not the most beneficial for the (wider) audience.

B: I’ve witnessed it because I watch a lot of stuff on YouTube and notice that, oh, that’s not right, or that’s not worked out well.

So, I know you said there are actual people doing (subtitles), are we talking about people who are tapping that in as a conversation is happening live?

S: Yes, live captioning is done – we do it. Since Covid, we have done a lot of live captioning for conferences, webinars, live events and AGMs. You would have seen it in sports at the end of football matches, for example.

Live captioning is someone typing – we use a stenographer with a shorthand keyboard, and that is all done live.

For the stuff you see on Netflix, we get the footage beforehand and use human beings. And, as you would have seen on Squid Game, (there is) subtitle translation, which is not just for deaf and hard-of-hearing people but also reaches a wider audience.

B: Wow! What a skill and a half to use that technology. There’s a certain keyboard to keep up – is that something that takes a long time to train in?

S: Yes, stenography certainly is that. It’s a skill that is sadly dying out. Once upon a time, stenographers would be in courtrooms tapping away; now we use the same set of skills, but for (live) television.

It takes a long time to train, and I think the industry is catching up with it. Some people don’t want the skills when they can just get a computer to do it, but you do see a dramatic reduction in accuracy.

There are a lot of people who rely on subtitles – not just the deaf and hard of hearing, but non-native English speakers and people who need the TV on low. For example, breastfeeding women or men with sleeping babies on them.

They really do need to have access to the television, to the news, or to podcasts. It is hugely beneficial to a wide audience, and we saw a massive boost in children using subtitles for reading during Covid.

The people who do rely on (subtitles) need people to do a good job, and I think that’s an important message.

B: Subtitles are on in the studio; we’ve got the news channels on and, of course, I can’t have the newsreaders banging away in the background talking away there.

Going back to the pandemic then, would you say it’s really highlighted subtitles?

S: I think it’s done a few things, because we’ve been pushed to work from home. (Subtitles) have really helped people to join meetings; and with people in masks giving interviews, those who might normally lip-read now rely on subtitles.

It is also really good for the fact that people who haven’t been able to get to work for accessibility reasons can now work from home, because we have much more use of subtitles. They can now participate effectively in the workplace, where they may have struggled beforehand.

I think it’s really benefitted everybody to be quite honest with you.

B: I think you’re right and you have done such a great job as all our experts do when they come on the show!

So, Sophie Muller, a multimedia specialist at VoiceBox, a leading subtitles agency, answers the question today: how do subtitles work?

I think she answered it – what a great example. There is a combination of the two, isn’t there: AI and somebody sat there, completely skilful, who can do it live. Thank you very much indeed.


If you are interested in widening your accessibility with subtitles and want to chat this over with us, we’d love to hear from you! Contact us here.