Discover the power of subtitles and elevate your content. Our blog explores the latest techniques for creating accurate and engaging subtitles.
Lately, subtitles are a must… and not only for people who are not familiar with languages.
Have you ever played, paused, rewound, played again, paused, rewound…and finally given up? Subtitles were born to provide translation. But today, even when watching movies, series and videos in our own language, we seem to need them. Why?
A brief history of subtitles
The first known use of intertitles, printed title cards carrying dialogue and translations, dates back to the early days of silent film. The method was far from ideal: it forced the audience to constantly shift their attention between the action on screen and the text. The silent horror film Dr. Jekyll and Mr. Hyde is a great example.
Once silent films began to be replaced by “talkies”, films with synchronized sound, there was no more need for intertitles. The first talkie, The Jazz Singer, was released in 1927. Of course, audiences in non-English-speaking countries still needed some form of translation to understand the dialogue. That’s how subtitles as we know them were born.
It wasn’t the industry’s first go-to, though. Initially, studios tried voice-over translations in the target language. But with the technology available at the time, the added voices often interfered with the original soundtrack and dialogue. Thus, the first subtitles were invented as a way to provide translations directly on screen, without interrupting the flow of the film.
The first use of subtitles on a sound film is usually attributed to The Jazz Singer itself: for its 1929 screening in Paris, the film was shown with French subtitles so local audiences could follow the English dialogue.
Since then, subtitles have become a common tool for providing translation and accessibility in films and television shows. And not only to translate what characters say, but what they think, and the sounds around them: anything that enhances the viewing experience.
Why are subtitles important?
Subtitles began as a necessity, a solution that couldn’t be avoided. Today, we actively choose them. Subtitles are important for several reasons:
Accessibility: Subtitles make audiovisual content accessible to people who are deaf or hard of hearing. They provide a written transcription of the spoken dialogue and other audio elements. Everyone should be able to enjoy audiovisual content.
Language learning: Subtitles can also be useful for people who are learning a new language. They can help them understand spoken language and learn new vocabulary.
Clarity: Even for people who are fluent in the language being spoken, subtitles can help clarify dialogue that may be difficult to understand due to accents, background noise, or other factors.
Convenience: Subtitles allow viewers to watch content in environments where audio may not be possible or desirable. Imagine you’re killing time in a noisy public place, or watching in a quiet space where others may be sleeping. Subtitles are convenient for everyone.
So, how come we need subtitles in our own language?
Clarity and convenience are two very strong arguments. When the audio quality is poor, or the dialogue is spoken in a regional accent or dialect that is difficult to understand, subtitles help, no matter the language.
But there are technical reasons for this too. Vox made a video on why we all need subtitles, with specialists, surveys and a brief but thorough historical analysis. They made it because, in a survey they published, 57% of respondents said they used subtitles, and the reason was that they had no idea what was going on most of the time.
This has to do with how microphones used to work and how they work now.
The first “talkie” movie, The Jazz Singer, used a single microphone to capture the sound. In the early days of film, sound was recorded separately from the picture using large, unwieldy recording equipment. Actors would often have to speak loudly and directly into the microphone, which was often visible on screen. People had to speak clearly.
Technology and accessibility
In the 1930s, studios developed more sophisticated microphone setups, including directional microphones that could pick up sound from specific areas on set. Actors still had to be positioned strategically, and the sound could only be captured from certain spots. Even so, it allowed them to move more naturally on set and improved the overall quality of the sound.
Later, magnetic tape allowed for more precise editing and mixing of sound.
We won’t go too deep into this. But stereo sound allowed more immersive soundscapes. Lightweight microphones allowed for more flexibility in on-location filming. Dolby noise reduction improved the quality of recorded sound and made it possible to capture quieter sounds. And so we reach the present day, where 3D audio and fully immersive sound experiences are a possibility.
Yet…we need subtitles. Even with all this technology. Why? Well…in pursuit of an immersive experience, some directors want everything to sound natural. Realistic. An actor can whisper, scream and mumble. All this makes it feel very real, but not always easy to grasp. And these are artistic decisions. Trends. Styles. So better sound doesn’t always translate to better understanding.
If you sum it all up, technology, accessibility, convenience and clarity, subtitles become a must. In online video ads, they lead to increased engagement and recall of brand messages among viewers. Subtitles improve viewer comprehension and information retention. They lead to more effective advertising. And they make everyone a part of the content, no matter their location, disability or context.
If you need a hand putting your audiovisual content into words, don’t hesitate to contact our language experts.