
Human subtitling or automatic subtitling?

Published: 5 Dec, 2022

The allure of automatic subtitles is easy to understand. You can get a functional translation of your audiovisual content quickly, easily and cheaply.

The popularity of subtitles among viewers is growing: more than 80% of Netflix users turn on subtitles at least once a month, and most younger viewers regularly watch with subtitles on streaming and social media platforms. It makes perfect sense to want to meet this demand in the most budget-friendly way possible, and that’s where automatic subtitling comes in.

However, it is important to be mindful of the major drawbacks of relying on automatic subtitling. In this post we set out the problems these tools may pose, and why human translation services are a much better choice if you want to subtitle any audiovisual content to a top-tier standard.

1. Localisation Accuracy

Translation isn’t just a matter of replacing a word with its counterpart in another language. It requires a deep understanding of the meaning behind every sentence in the source language in order to transfer it to the target language. What does this mean? Basically, that cultural conventions rarely have perfect equivalents across cultures, so there is always context to take into consideration.

Don’t expect an automatic subtitle generator to meet all these complex criteria. It will certainly try, and it’s true that the technology is constantly advancing, but it still has a long way to go. YouTube’s automatic subtitles reach an accuracy rate of only about 60-70%, meaning roughly 1 out of every 3 words could be wrong. And that’s the largest video platform in the world, using Google’s speech recognition resources.

If Artificial Intelligence (AI) systems make errors when transcribing spoken language, how accurate can a translation built on those incorrect words be?

A real person with native fluency in the language to be translated or subtitled is definitely more reliable. Not only will they properly understand the text, but they will also draw on their cultural knowledge to convey the message of the original piece correctly. In a nutshell, they will strive for a natural style. Take, for example, the lexical ambiguity of the word ‘pasta’ (colloquially, money) said by a Spanish character; an automatic tool would subtitle it just like that, whereas a human subtitler would know how to localise it more accurately, using the word ‘dosh’ for a British audience.

2. Nuances in Literal Translation

Short films, movies or documentaries are, in essence, artistic expressions. The way a character says their line is as important as the line itself. Subtitles have to capture the nuance inherent to dialogues and performances to successfully convey the stories they tell.

Automatic subtitles aren’t even close to the level of sophistication needed to contextualise speech and represent the scene the actors are actually portraying.

Real translators, however, know how to do it: they take into account intonation, pauses, facial expressions, the performances, the relationships among the characters in the scene, and so on. Bearing all of this in mind, a translator is able to subtitle the work faithfully and accurately. Think of a play on words or a reference that will only be recognised in a particular region; a translator’s hardest job in such cases is getting the audience to understand those jokes and laugh at them. Automatic subtitle generators can only translate word for word, meaning all the humour and charm of the original text is lost.

3. Multi-Language Subtitling

Subtitling films or series in which characters speak several languages requires much more effort than subtitling ones that only need subtitles in a single language.

Running a script through a piece of software and expecting it to properly comprehend every language is, in this day and age, wishful thinking. Translation software needs dedicated language resources for every language involved. And even then, automatic translation tools can’t recognise a context in which more than one language is being spoken. Take, for example, a situation in which two people are experiencing a language barrier; automatic subtitles would make it look as if they were communicating with each other swimmingly.

Moreover, choosing whether or not to subtitle dialogue in a certain language is also a conscious decision. If a French character is in Berlin and can’t understand what a group of Germans are saying, that German dialogue wouldn’t be subtitled, in order to portray the character’s confusion, even if other German dialogue elsewhere is subtitled. A human can make this decision, knowing that the scene’s intention is to depict that state of confusion. Automatic subtitles would make no such distinction and would translate all spoken German, missing the point of the scene.

4. Errors in Morphology, Grammar and Punctuation

One of the telltale signs is grammatical gender defaulting to the masculine. If the original audio is in English, a language where adjectives are not gendered, automatic subtitles into a gendered language might end up having a female character refer to herself in the masculine.

Evil Tapes (2022) | Screenshot from Prime Video.

Another big drawback of automatic subtitling tools is their inability to determine the right verb form in sentences with little context, which can go as far as mistaking an imperative for an infinitive.

Evil Tapes (2022) | Screenshot from Prime Video.

What’s more, who hasn’t come across forgotten vocative commas or randomly placed capitals?

We’re going to eat Grandma. [Ew!]

We’re going to eat, Grandma. [That’s better]

Errors like these are clear evidence that the subtitles haven’t passed through human hands and that no one has bothered to proofread them, inevitably tarnishing the work’s image.

5. Audiovisual Translation Norms

Bear in mind that the subtitling process hinges on a whole string of syntactic, orthotypographic and aesthetic requirements that only a professional trained in audiovisual translation can comply with.

The length of a subtitle is an important aspect to consider. It has to be synchronised with the lines spoken by the characters on screen while giving the audience enough time to read it, which is measured in CPS (characters per second). This often takes an ‘art’ of condensation: conveying in writing the same thing that is heard, in very little screen time. An automatic tool will include every word spoken, even if the subtitles end up rushed and cluttered.
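
As a rough illustration only (a hypothetical Python sketch, not a tool from our workflow), this is what a reading-speed check might look like, assuming a commonly cited ceiling of around 17 CPS; the exact figure varies by client, language and audience:

```python
def cps(text: str, start: float, end: float) -> float:
    """Reading speed of a subtitle in characters (spaces included) per second."""
    duration = end - start  # seconds the subtitle stays on screen
    return len(text) / duration

# A 60-character line displayed for 3 seconds reads at 20 CPS.
line = "I told you a thousand times not to open that door, didn't I?"
speed = cps(line, start=12.0, end=15.0)

MAX_CPS = 17  # a commonly cited guideline, not a universal rule
print(f"{speed:.1f} CPS -> {'condense this subtitle' if speed > MAX_CPS else 'ok'}")
```

When the check fails, a human subtitler condenses the wording rather than cramming every spoken word onto the screen.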

It is also important to analyse the clauses of a sentence correctly when breaking a subtitle up into lines, so as not to mistakenly ‘break’ phrases, conjunctions and other linguistic units. Beyond knowing the standards for line breaking, this requires a profound knowledge of the syntax of the target language, something automatic tools lack.
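
Purely to picture the problem (a deliberately naive Python sketch, not how any professional tool works), compare a mechanical break at a fixed width with the break a human subtitler would choose; the 42-character limit is one commonly used per-line guideline:

```python
import textwrap

line = "She finally opened the old wooden box her grandmother had hidden"

# Naive approach: wrap at a maximum line length with no regard for syntax.
naive = textwrap.wrap(line, width=42)
# -> ['She finally opened the old wooden box her', 'grandmother had hidden']
# The unit "her grandmother" is torn apart across the two lines.

# A human subtitler breaks before the relative clause and keeps units intact.
human = ["She finally opened the old wooden box", "her grandmother had hidden"]

for version in (naive, human):
    print(" | ".join(version))
```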

6. Subtitles for the Deaf and Hard-of-Hearing (SDH)

Besides being essential for accessibility, descriptive subtitles provide key information about the sound effects of an audiovisual work. They help people who are deaf or hard of hearing, while also enhancing the viewing experience by making it possible to appreciate a video’s sound fully even when listening conditions aren’t optimal. Everyone wins with subtitles for the deaf and hard-of-hearing.

Despite being able to identify some sound effects, automatically generated descriptive subtitles will leave out relevant information. A person can specify much better whether a creak comes from a chair or a door, or whether footsteps are thundering at an epic moment or stealthy in a horror scene.

Silence can be used to create a dramatic effect on the audience, and descriptive subtitles have to be able to convey this nuance. For instance, if an enthusiastic audience suddenly stops clapping due to what a character is doing on stage and this isn’t shown on screen, it must be pointed out using descriptive subtitles. The same applies when characters move their lips but no sound is coming out of them or their voice sounds muffled. A real translator knows how to interpret and describe those sounds, something that, once again, automation can’t pull off.

And of course, the most fundamental thing: compliance with the applicable UNE standards, something only a qualified team can guarantee.

Conclusions

The art of audiovisual storytelling is a human talent. Artificial intelligence can’t yet fully grasp the nuances of subtitling in a medium that goes beyond logic, data sets and algorithms. It lacks the skills and cultural connections of a flesh-and-blood translator, which are necessary to subtitle all kinds of content.

We can guarantee that, with a human touch, your works will reach your audience. Our team of multilingual audiovisual translators have been helping our clients tell the stories they want to tell for years, delivering subtitle translation and subtitling services with the utmost care. Get in touch with us and start today.

I’m late to this post and my subtitles are ready to go. What should I do?

Stay calm and don’t panic. If you already have subtitles for your project but aren’t quite sure whether they’re up to standard, we can help you. We’ll proofread your subtitles in detail and correct any kind of mistake that doesn’t meet subtitling guidelines, whether it’s a language issue or a time-coding one. Before you pluck up the courage and submit your work to that international festival or upload it to an online video platform, make sure everything is in order. Ask for a quote for our professional proofreading service.