Translation & Subtitling Archives - 3Play Media
https://www.3playmedia.com/blog/tag/translation-subtitling/

Closed Captioning vs. Subtitles: What’s the Difference and Why it Matters for Accessibility (Including EAA)
https://www.3playmedia.com/blog/closed-captioning-vs-subtitles/
Fri, 11 Apr 2025

The post Closed Captioning vs. Subtitles: What’s the Difference and Why it Matters for Accessibility (Including EAA) appeared first on 3Play Media.


  • Captioning

Closed Captioning vs. Subtitles: What’s the Difference and Why it Matters for Accessibility (Including EAA)


Watch the Webinar: How the EAA Impacts Global Business


Captions and subtitles are important timed text solutions that make video content accessible to all audiences. But over the last several years, the two have become clouded with questions and confusion, with the top concern being “What’s the difference between captions and subtitles?”

Many experts have weighed in, slapping labels on “captions” and “subtitles” in order to give each a singular yet narrow definition. Some of these definitions may be correct, but they’re often only partially so. Why?

Captions and subtitles are a lot more complex than most people realize. While they may seem interchangeable, understanding the differences between captions and subtitles is not only crucial for selecting the most appropriate option to enhance viewer experience and reach, but it also carries significant weight when addressing legal and accessibility requirements. For organizations and content creators serving the European market, this understanding is paramount for ensuring compliance with the European Accessibility Act (EAA).

In this blog, we’re diving head-first into the captions vs. subtitles debate. We’ll define timed text, captions, and subtitles; review the various types of captions and subtitles; and explore why they’ve become such a source of confusion in recent years.

What is a timed text file?


A timed text file is a text-based file that includes timing information.

In the accessibility space, timed text files are usually intended to pair the transcription of dialogue and/or sound to media. The timing information allows the text to be synchronized to specific time codes of media. Both captions and subtitles are forms of timed text.
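To make this concrete, here is a minimal sketch of how a timed text file pairs transcription with time codes, using the SubRip (SRT) format as an example. The cue text and timings below are invented for illustration; SRT is just one of several common timed text formats (others include WebVTT and TTML).

```python
# Minimal sketch: build an SRT (SubRip) timed text file, where each
# cue pairs transcribed text with the time codes it appears between.
# The dialogue and timings are invented for illustration.

def srt_timestamp(seconds: float) -> str:
    """Format seconds as an SRT time code: HH:MM:SS,mmm."""
    ms = round(seconds * 1000)
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def make_srt(cues):
    """cues: list of (start_seconds, end_seconds, text) tuples."""
    blocks = [
        f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}"
        for i, (start, end, text) in enumerate(cues, start=1)
    ]
    return "\n\n".join(blocks) + "\n"

cues = [
    (1.0, 3.5, "These are captions."),
    (3.6, 6.0, "[door slams]"),
]
print(make_srt(cues))
```

Because each cue carries its own time codes, a player can synchronize the text to the video independently of the media file itself, which is what allows captions and subtitles to be toggled, restyled, or swapped for another language.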

What are captions?

Captions were introduced to accommodate D/deaf and hard of hearing television viewers in the early 1970s. Eventually, captions became a mandated requirement for broadcast television in the United States.

Captions provide a textual transcript of a video’s dialogue, sound effects, and music. Captions are designed for use by D/deaf and hard of hearing audiences, but have gained popularity with all audiences.

Screenshot of man and woman talking. Closed caption reads "These are captions."

Standard closed captioning style: white text on a black box.

Captions appear as white text over a black box by default, but can sometimes be customized by viewers, depending on where the media is being viewed. Placement varies, but is often centered at the bottom of the screen for readability. When graphics or text appear in the lower third of the video, captions are typically placed at the top of the screen.

608 Captions

608 closed captions (also known as CEA-608, EIA-608, or Line 21 captions) were the standard captioning type for analog television transmission. 608 captions cannot be customized by viewers, though they are compatible with digital television.

708 Captions

708 closed captions (also known as CEA-708/EIA-708/CTA-708 captions) are the newer standard captioning type for digital television. 708 captions are customizable by viewers, but are not compatible with analog television.

Styles
Captions have three main display styles: pop-on, roll-up, and paint-on. Pop-on is used for recorded content, and roll-up is used for live programming. Paint-on is rare in modern captioning workflows, but may occasionally be used in certain types of programming.

What are subtitles?

Subtitles were introduced in the 1930s, when silent film transitioned to “talkies,” or film with spoken audio, in order to accommodate foreign audiences who didn’t understand the language used in a film. 

Subtitles provide a textual translation of a video’s dialogue. Traditionally, subtitles assume the viewer can hear the audio but cannot understand the language. The exception to this is subtitles for the D/deaf and hard of hearing, which assume the viewer cannot hear the audio or understand the language.

Screenshot of man and woman talking. White subtitle reads "These are subtitles."

Common subtitle style: white text with black dropshadow, no background.

Screenshot of man and woman talking. White on semi-transparent black box subtitle reads "These are subtitles."

Subtitles mimicking the appearance of closed captions.

Subtitles can appear in a variety of styles, but often appear as white or yellow text outlined in black, or with a black dropshadow. It is also common for subtitles to mimic the appearance of captions. Placement varies, but is often centered at the bottom of the screen for readability and ease in translation. When graphics or text appear in the lower third of the video, subtitles are typically placed just above the graphic/text. Subtitles can sometimes be customized by viewers, depending on where media is being viewed.

non-SDH

Non-subtitles for the d/Deaf and hard of hearing (non-SDH) are traditionally referred to as just “subtitles.” Non-SDH are designed for viewers who can hear the dialogue and non-dialogue information but cannot understand the language. The only transcribed element of non-SDH is dialogue. On-screen graphics or words may also be transcribed, when time allows for the translation of these elements.

SDH

Subtitles for the D/deaf and hard of hearing (SDH) assume the end user cannot hear the dialogue and include important non-dialogue information such as sound effects, music, and speaker identification.

SDH were originally designed for viewers who cannot understand the language, but are increasingly used in place of captions on some video platforms and services.

Forced Narrative

Forced narrative (FN) subtitles, also known as forced subtitles, are overlaid text that clarifies pertinent information meant to be understood by the viewer: dialogue, burned-in texted graphics, and other information that is not otherwise explained or easily understood.

Open vs. Closed
Both captions and subtitles can be open or closed.

On and off toggle buttons

Open: The captions or subtitles are permanently visible or burned onto the video. The viewer cannot turn them off.

Closed: Captions and subtitles are not visible unless they are turned on. The viewer can toggle the captions or subtitles on and off at their leisure.

Why Do Caption and Subtitle Choices Matter for European Accessibility Act (EAA) Compliance?

 

Learn how 3Play can support you in becoming EAA compliant

 

Why are captions sometimes called subtitles and vice versa?

Captions and subtitles are infamous for being confused with one another, and there are a few reasons for this. Let’s take a quick look at how global differences in terminology and the increased use of SDH have added chaos to the CC vs. subs discourse.

Global Terminology Differences

Globe with location pins in various places. Words "CC" and "SUB" appear next to pins, depending on location.

Outside of the United States and Canada (for example: the UK, Ireland, and most other countries), video subtitling and captioning are usually considered one and the same. In other words, the use of the term “video subtitling” does not distinguish between subtitles used for foreign language translation, and captioning used to aid the D/deaf and hard-of-hearing audiences.

The globalization of video content across corporate, education, and entertainment industries has greatly impacted how viewers use the terms “captions” and “subtitles”. It can be hard for viewers to understand the difference between the two when different entities label their accessible timed text files based on regional preferences. 

SDH = CC…for some

Because of the aforementioned globalization of video content, closed captions and subtitles for the D/deaf and hard of hearing are now commonly mistaken for one another. It’s easy to see why: they both serve D/deaf and hard of hearing audiences and often look alike.

But SDH and captions are different. SDH were initially designed to accommodate D/deaf and hard of hearing audiences who could not understand the language. But over the past few years, SDH have been used in place of captions on platforms where traditional captions are not supported. Sometimes the platform will refer to SDH as “SDH”; other times, they may be called “CC”. There are even cases where they could be called both, e.g. “CC/SDH”.

Captions vs. Subtitles

Because of the many nuances involved in defining captions and subtitles, it’s hard to compare both in general terms. To get to the heart of the individual differences between them, it’s important to break captions and subtitles down into their individual types.

| Feature | Captions: 608 | Captions: 708 | Subtitles: SDH | Subtitles: non-SDH | Subtitles: FN |
|---|---|---|---|---|---|
| Text transcribed | All | All | All | Dialogue only | Only pertinent dialogue & information not easily understood by viewer |
| Timed text synced to video | Yes | Yes | Yes | Yes | Yes |
| Audience assumption | D/deaf and hard of hearing | D/deaf and hard of hearing | D/deaf and hard of hearing | Hearing | Hearing |
| Can be turned on/off | Yes | Yes | Yes | Yes | No |
| In source language | Yes | Yes | Sometimes | Sometimes | No |
| Speaker identification | Yes | Yes | Yes | No | No |
| Music & sound effects | Yes | Yes | Yes | No | No |
| Signs & graphics transcribed | No | No | No | Sometimes | Yes |
| Translation options | Limited | Limited | Yes | Yes | Yes |
| Appearance | White text on black box; 32 characters per line | White text on black box; 32 characters per line | Varies; 42 characters per line | Varies; 42 characters per line | Varies; 42 characters per line |
| Placement | Varies; usually centered at bottom, moving to top for lower third graphics | Varies; usually centered at bottom, moving to top for lower third graphics | Varies; usually centered at bottom, moving to top or just above lower third graphics | Varies; usually centered at bottom, moving to just above lower third graphics | Varies; usually centered at bottom, moving to just above lower third graphics |
| User customization (when available) | No | Yes | Sometimes | Sometimes | No |

There’s a lot of nuance missing from the captions vs. subtitles discourse, and the complexities of each won’t go away anytime soon. In the broadest sense, each serves a different purpose with a common goal:

  • Captions provide an accessible way for viewers who cannot hear audio to watch video.
  • Subtitles provide an accessible way for speakers of any language to watch video.

Video accessibility is the string that ties captions and subtitles together, but there are ways to move beyond generalization of these accessibility solutions. The question of “what’s the difference between captions vs. subtitles?” is one that will always require us to break it down further. By comparing and contrasting the individual types of captions and subtitles, we can begin to grasp the differences between the two a lot more easily. 

 


 

This blog post was originally published by Sofia Leiva on August 14, 2016, and was updated on June 22, 2021 by Kelly Mahoney. It has since been updated again for comprehensiveness, clarity, and accuracy.



The Ultimate Guide to Subtitles: Different Types, How They Work, and When to Use Them
https://www.3playmedia.com/blog/the-ultimate-guide-to-subtitles-different-types-how-they-work-and-when-to-use-them/
Thu, 22 Jun 2023


  • Subtitling

The Ultimate Guide to Subtitles: Different Types, How They Work, and When to Use Them




Video subtitling is instrumental in reaching global audiences, but can be a complex and nuanced media accessibility solution. Add captions to the equation, and it can become even more confusing for producers and creators of video content.

We know it’s easy to get bogged down with the different types of subtitles. That’s why we’re excited to debut our new eBook, The Ultimate Guide to Subtitles: Different Types, How They Work, and When to Use Them.

In this eBook, we compiled a comprehensive overview of the different types of subtitles based on the knowledge and experience of 3Play’s tenured subtitling experts. 

The Ultimate Guide to Subtitles covers the top subtitling solutions used across industries, including subtitles for the D/deaf and hard of hearing (SDH), non-subtitles for the D/deaf and hard of hearing (non-SDH), and forced narrative (FN) subtitles. Read on for a closer look into our extensive guide to all things subtitling.

Everything You Need to Know About Different Types of Subtitles 🌎

Discover the Different Types of Subtitles and How They Work

Learn all there is to know about subtitles in general. We provide an overview of their history, how they work, what they can look like, and how they’re encoded. Then, dig deeper and discover how SDH, non-SDH, and FN subtitles are defined.

Understand How Subtitling Types Compare

As mentioned above, subtitling is a nuanced solution and it can be difficult to wade through the different types without additional context. Explore in detail how each subtitling type compares to one another and how they stack up to captions. We even discuss why subtitles and captions have become so entangled in recent years and how you can better determine which media accessibility service you really need for your video.

Learn the Best Subtitling Type for Your Video

Each subtitling type has differing use cases and audience assumptions. In The Ultimate Guide to Subtitles, we cover the top use cases for SDH, non-SDH, and FN subtitles using examples that span across industries to help you find the best subtitling type for your video.

Resources

Gain access to a curated list of key 3Play Media resources for you to reference as you make accessibility part of your content production process.

The Ultimate Guide to Subtitles offers an in-depth exploration of the different types of subtitles, their functionality, and how they compare to captions. Using this knowledge and helpful use case examples, you will be able to select the perfect subtitling solution for your media based on your viewers’ dynamic needs, no matter where they’re located in the world.

The Ultimate Guide to Subtitles: Different Types, How They Work, and When to Use Them. Download the eBook



SDH vs. CC: What’s the Difference?
https://www.3playmedia.com/blog/whats-the-difference-subtitles-for-the-deaf-and-hard-of-hearing-sdh-v-closed-captions/
Mon, 06 Mar 2023


  • Captioning

SDH vs. CC: What’s the Difference?


The Ultimate Guide to Subtitles: Different Types, How They Work, and When to Use Them [Free Ebook]


When it comes to media accessibility, one of the most common questions from television viewers revolves around the differences between subtitles and closed captions. But between the rise of streaming content and the global use of the term “subtitles” versus “captions,” the answer has become complicated.

As the lines between subtitles and captions continue to blur, perhaps no distinction has become more confusing than the difference between subtitles for the d/Deaf and hard of hearing (SDH) and closed captions (CC).

The issue of SDH vs. CC has been compounded by the availability of both options on certain streaming platforms. Adding further confusion, there are also the matters of:

  • Mixed usage of terminology 
  • Different interpretations of what makes a timed text file SDH or CC
  • General misinformation on the purpose and function of SDH vs. CC files

This widespread confusion is precisely why we’ve decided to tackle SDH vs. CC in this blog. We’ll review the key differences between subtitles and closed captions, closely examine SDH subtitles, cover each of their respective roles and use cases, and explain why some streaming services are moving towards offering both options to viewers.

 

 

 


Defining Subtitles and Captions

Before fully understanding the difference between SDH and closed captions, it is helpful to first understand the basic differences between subtitles and captions.

Person sitting between two boxes that read "CC" and "sub".

How are they alike?

Both subtitles and captions are timed text files synchronized to media content, allowing the text to be viewed at the same time the words are being spoken. Captions and subtitles can be open or closed.

How are they different?

In the United States and Canada, subtitles are intended for hearing viewers who do not understand the language. Traditionally, subtitles show the spoken content but not sound effects or other audio elements, and the term often refers to translations (think: subtitles for a foreign film). In places like the UK, the term “subtitles” is used to describe both subtitles and captions.

Closed captions are designed for d/Deaf and hard-of-hearing audiences. They communicate all audio information, including sound effects, speaker IDs, and non-speech elements. They originated in the 1970s and are required by law for most video programming in the United States and Canada.

What are Subtitles for the d/Deaf and Hard of Hearing (SDH)?

It’s important to note that there are a few different types of subtitles. The most frequently used types are SDH, non-SDH, and forced narrative (FN).

SDH stands for subtitles for the d/Deaf and hard of hearing. These subtitles assume the end user cannot hear the dialogue and include important non-dialogue information such as sound effects, music, and speaker identification. In the United States and Canada, SDH traditionally assume the end user can neither hear the audio nor understand the language being spoken, whereas traditional subtitles (also referred to as non-SDH) assume the viewer can hear the audio but doesn’t know the spoken language.

SDH often emulates closed captions on media that does not support closed captions, such as digital connections like HDMI or OTT platforms. In recent years, many streaming platforms, like Netflix, have been unable to support standard broadcast Line 21 closed captions. This has led to a demand for English SDH subtitles styled similarly to FCC-compliant closed captions instead. 

SDH can also be translated into foreign languages to make content accessible to d/Deaf and hard-of-hearing audiences who speak other languages.

Translation
Translation is often cited as a major difference between subtitles and captions. But can’t captions also be in other languages?

 

Yes! It’s common in the United States and Canada to find closed caption offerings in Spanish and French, along with other languages. The FCC even requires Spanish CC for all Spanish television programming in the US. There are limitations with translated closed captions, however. Because of CC’s line limits and lack of extensive international character support outside of Western languages, SDH subtitles are preferred to get the most accurate translations for d/Deaf and hard of hearing viewers across languages.

 

3Play Media Explains… SDH vs. CC – Watch the Video 👀

 

A Deep Dive into SDH vs. CC

SDH subtitles and closed captions are closely related, and there’s often confusion between the two. One of the main reasons? Preferred jargon.

The term “closed captions” has dominated the vernacular for nearly half a century in North America. The term “subtitles” has encapsulated any timed text format in the UK and other parts of the globe. 

But in recent years, rapid developments in streaming content and the globalization of media have shaken up the popular nomenclature across the world. This has left viewers and users of these accessibility services scratching their heads, wondering how SDH and CC differ.

Appearance

Example of SDH subtitles styled to closely resemble closed captions. Text reads "I'm street smart..." in white text on a semi-transparent black background.

SDH subtitles styled to closely resemble closed captions: white text with a semi-transparent black background.

SDH subtitles have a lot of flexibility in terms of appearance. They can be customized by professional captioners to look exactly like closed captions, or styled to match a customer’s request or platform’s specifications. 

Example of SDH subtitles styled to a standard subtitling appearance. Text reads "I'm street smart..." in white text, black outline, no background.

SDH subtitles styled to a standard subtitling appearance: white text, black outline, no background.

SDH subtitles’ appearance can sometimes be determined by a video player or platform, which sets the appearance independently of the original captioner. Occasionally, SDH can also be customized by the end user, but this varies based on the player or platform’s customization options.

 
Example of closed captions. Text reads "I'm street smart..." in white text on a black background.

Default closed captioning style: white text on an opaque black background.

By default, closed captions are displayed as white text on a black box, with placement customized on the captioner’s end. This has changed over the years with the introduction of digital television and the 708 captioning standard, which allows for user customization.

User Customizations
When customization options are available to users, they can choose from a variety of font, sizing, and color options for SDH or CC. Customization options vary depending on the television, video player, or OTT platform capabilities.

Placement

SDH subtitles and closed captions both support placement. Viewers will often find SDH and CC placed in the bottom center, moving to the top to avoid lower thirds. Some styles of CC may include horizontal placement to indicate speaker changes.

SDH can theoretically be placed anywhere on the screen if they are burned-in. As a best practice, SDH are typically centered for readability and ease in the translation process. 

Caption placement is usually implemented by a captioner and cannot be adjusted by the user unless the captions are formatted to 708 standards. According to FCC rules, captions must be positioned in such a way to avoid covering important lower third graphics.

Ultimately, SDH and CC positioning is dictated by the file type being used, or by the requested formatting specs from a platform or television network. 

Why are SDH and CC often centered?
Many streaming platforms and networks are moving towards center placement for both SDH and CC files for readability. It’s still common to encounter CC positioning to indicate speakers, but current trends point to left-justified, center-aligned SDH and CC.

 

Streaming services that follow this trend include Netflix and Amazon.

Encoding

The move from analog television to high-definition (HD) media over the last 20 years had major implications for the encoding of closed captions and subtitles.

Standard 608 closed captions are transmitted via Line 21 as a stream of commands, control codes, and text. 708 closed captions are transmitted via MPEG-2 video streams in MPEG user data.

Subtitles, on the other hand, are often encoded as bitmap images – a series of tiny dots or pixels. And this method of transmission is a lot more compatible with newer digital media methods.

HD disc media, like Blu-ray, does not support traditional closed captioning but is compatible with SDH subtitles. The same goes for some streaming services and OTT platforms. SDH formats are increasingly used on these platforms due to their inability to support traditional Line 21 broadcast closed captions. That being said, some classic captioning formats, like SCC, have proven to be versatile across television and digital formats.
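As a hypothetical sketch of what moving between these worlds can involve, the snippet below converts SRT-style cue timings (comma before the milliseconds) into WebVTT style (period before the milliseconds), a common step when repurposing caption text for web players. The file contents are invented, and real conversion tools must also translate positioning and styling metadata.

```python
import re

# SRT and WebVTT cue timings differ mainly in the millisecond
# separator: SRT writes "00:00:01,000", WebVTT writes "00:00:01.000".
# This invented example ignores the styling and positioning metadata
# that real converters must also handle.

SRT_TIME = re.compile(r"(\d{2}:\d{2}:\d{2}),(\d{3})")

def srt_to_vtt(srt_text: str) -> str:
    """Rewrite SRT time codes and prepend the required WEBVTT header."""
    return "WEBVTT\n\n" + SRT_TIME.sub(r"\1.\2", srt_text)

srt = """1
00:00:01,000 --> 00:00:03,500
These are captions.
"""
print(srt_to_vtt(srt))
```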

SDH vs. CC: At a Glance

| Feature | SDH | Closed Captions |
|---|---|---|
| Timed text synced to video | Yes | Yes |
| Can be turned on/off | Yes | Yes |
| In source language | Sometimes | Yes |
| Speaker identification | Yes | Yes |
| Sound effects | Yes | Yes |
| Translation options | Yes | Limited |
| Text appearance | Varies; often white text on black or semi-transparent background to mimic captions | Usually white text on black background |
| On-screen placement | Varies; typically centered at the bottom, with movement to the top for lower third graphics | Varies |
| Encoding | Supported through HDMI | Not supported through HDMI |

 

Why Do Streaming Platforms Sometimes Include Both SDH and CC?

While many streaming and OTT platforms only offer one timed text option for viewers to use, some have started offering both SDH and CC options when available.

Apple TV+ is one such platform, offering a wide array of accessibility choices for viewers on select programming. Depending on the program chosen, a viewer could find themselves choosing between CC and SDH. So why offer both?

Person thinking with text in a thought bubble: "English CC, English SDH, English non-SDH."

The answer can differ depending on the platform, but by offering both options, viewers are able to choose the format they prefer. In situations where no distinction is made between CC and SDH, the files could be considered one and the same.

When both options are available, it’s likely that the captions originate from a true CC file and are formatted to match that style, whereas the SDH could be a simpler timed transcript in the source language that was intentionally designed for translation into other languages. The difference between the two isn’t always clear when both are offered on a platform, but it usually comes down to how each is displayed.

 
 
 

Closed captions and subtitles for the d/Deaf and hard of hearing are like siblings: closely related, with similar mannerisms, yet each has their own unique traits and appearance.

Like many media accessibility services, CC and SDH are nuanced and tricky to definitively declare as being one specific solution designed for one specific purpose. In the greater scheme of timed text files, either solution offered by a television network or streaming platform will provide an accessible experience for viewers.

Neither CC nor SDH will ever fit neatly into one box, and defining them may only get more complicated as digital video evolves. But one thing remains certain for CC and SDH: they will always serve the d/Deaf and hard of hearing community first and foremost.


The Ultimate Guide to Subtitles: Different Types, How They Work, and When to Use Them. Download the eBook 

This blog was originally published by Lily Bond on May 21, 2014 as “How Subtitles for the Deaf and Hard-of-Hearing (SDH) Differ From Closed Captions.” This blog was updated on August 24, 2021 by Elisa Lewis and has since been updated again for comprehensiveness, clarity, and accuracy.



What are Forced Subtitles?
https://www.3playmedia.com/blog/what-are-forced-narrative-subtitles/
Tue, 14 Feb 2023


  • Subtitling

What are Forced Subtitles?


Download the [FREE] Checklist: Dubbing


We previously covered SDH subtitles, non-SDH subtitles, and when they’re used; the difference between SDH subtitles and closed captions; and how subtitles vary from closed captions in general. That leaves us with a common yet important subtitle type that most viewers never actually have to toggle on: forced narrative subtitles.

A number of subtitling types exist in the world of video translation and localization services. The most commonly used subtitles include: 

  • Subtitles for the Deaf and Hard of Hearing (SDH)
  • non-Subtitles for the Deaf and Hard of Hearing (non-SDH)
  • Forced Narrative (FN) 

Forced narrative subtitles are crucial to supporting audience comprehension in a number of programs, regardless of the genre. So why is that? In this blog, we’ll explore what forced narrative subtitles are, what they look like, and when to use them.

What are Forced Subtitles, and What Purpose Do They Serve? 

Forced narrative (FN) subtitles, sometimes referred to as forced subtitles, are overlaid text used to clarify pertinent information meant to be understood by the viewer: dialogue, burned-in texted graphics, and other information that is not otherwise explained or easily understood. Forced narrative subtitles are typically used in video translation and localization workflows to ensure any viewer can understand critical textual elements displayed on screen.

Forced narrative subtitles broaden the viewing experience across a wide range of countries, languages, and devices. FN subtitles are delivered as separate timed text files; therefore, they are not burned into the video.

How are Forced Narrative Subtitles Different from Traditional Full Subtitles?
Forced subtitles clarify only the necessary information that would not be understood by the audience. The subtitles are “forced” because a viewer will not have to toggle the subtitles on to see them.
 

A full subtitle file translates the entirety of a program’s content, but must be toggled on by the viewer. It may or may not contain forced narrative content, depending on the viewing platform and other factors, such as timing. For example, forced narrative information, such as the translation of a sign or other on-screen text, may be omitted from a full subtitle file when dialogue occurs at the same time the text is displayed, because dialogue translation takes precedence over forced narrative elements.

What Do Forced Subtitles Look Like?

Many OTT providers will not display forced subtitles unless the Subtitles/CC settings are set to “off.” That being said, some platforms, like Netflix, incorporate forced narrative content into full subtitling and closed caption files.

When forced narrative subtitles are displayed on their own, their appearance can mirror that of typical subtitling or closed captioning files. And much like subtitles and captions, the visual appearance of FN subtitles varies depending on the platform, player, television, or other viewing device.

 

Adding dubbing or voice-over to your video? This checklist covers everything you need to consider 💬

How Are Forced Subtitles Used?

Forced narrative subtitles are commonly used in several scenarios. Let’s explore these different use cases for FN subtitles to better understand what they are and how they work.

Sporadic Foreign Language

Although a film may be in one source language, certain characters will occasionally use a phrase or short segment in a different language. 

[Image: Person speaking on the phone, with a speech bubble reading "Guten Tag." Below, a forced narrative subtitle reads "Good afternoon."]

One scenario might be a German character living in the United States who makes a phone call to a family member where they speak in German. If the information during this scene is important to the plot and overall understanding of the movie or show, FN subtitles will be used to translate the conversation.

 

Translation of Labels

Sometimes burned-in text graphics are used to enhance the viewing experience. Oftentimes, these are labels for locations, names, or dates. Since they are burned into the video in the original language, FN subtitles can be used to translate these into another language for viewers.

[Image: Silhouette of Boston, Massachusetts, with Chinese characters written above it. Below, a forced narrative subtitle reads "Boston, Massachusetts."]

This image showcases an example of a film containing a location label in the original language at the top. When shown in the United States, English FN subtitles would be used to translate the city name for English-speaking viewers to understand.

 

 

 

Other Forms of Communication

Forced narrative subtitles are helpful in cases where other forms of communication are showcased in a video, such as non-verbal communication formats like sign language, or fictional languages, such as Game of Thrones’ Dothraki or the Elvish dialects in The Lord of the Rings.

[Image: Person using sign language in front of a blackboard with a sketch of a tree. Below, a forced narrative subtitle reads "Today we're learning about trees."]

For example, if a character communicates in sign language, forced narrative subtitles would be used to clarify the meaning for viewers who aren’t familiar with the language. This example shows forced narrative subtitles below a teacher communicating via sign language.

 

 

Transcribed Dialogue

Sometimes forced narrative subtitles are used for transcribed dialogue in the same language. This is done to assist audience members when audio is inaudible or distorted.

[Image: Police cruiser chasing a car with explosions behind them. Below, a forced narrative subtitle reads "We're in pursuit!"]

It may be hard to hear dialogue in an action movie with a lot of background noise, or in a documentary with poor audio quality. In either of these cases, FN subtitles could be used to clarify dialogue for the viewer.

 

 

Forced Narrative Subtitling with 3Play Media

Did you know 3Play Media creates forced narrative subtitles for video content?

Our experienced translation and subtitling team creates forced narrative subtitles for video content across networks and major OTT platforms. View our plans, and get in touch with 3Play Media to get started!

Not sure if you need forced narrative subtitles?

We’re here to help. Our team is filled with experienced localization professionals who have created countless SDH, non-SDH, and FN subtitling files for a variety of networks and streaming platforms. Reach out to begin scoping your project, and we’ll help determine if forced narrative subtitling is right for your content.

Dubbing Checklist: Get your free checklist

This blog was originally published by Elisa Lewis on December 8, 2017, as “What Are Forced Narrative Subtitles?” and has since been updated for comprehensiveness, clarity, and accuracy.



The post What are Forced Subtitles? appeared first on 3Play Media.

Why Reformatting is the Best Way to Edit Existing Captions and Subtitles https://www.3playmedia.com/blog/why-reformatting-is-the-best-way-to-edit-existing-captions-and-subtitles/ Mon, 09 Jan 2023 14:00:42 +0000 https://www.3playmedia.com/blog/why-reformatting-is-the-best-way-to-edit-existing-captions-and-subtitles/ • Download the [FREE] Checklist: Caption Reformatting Have you ever watched a rerun of your favorite television show with the original captions on and noticed that they don’t seem entirely correct? Closed captions could be delayed, covering graphics, paraphrasing dialogue…or maybe all of the above. Now, you may wonder how these captions slipped through the...

The post Why Reformatting is the Best Way to Edit Existing Captions and Subtitles appeared first on 3Play Media.


  • Captioning

Why Reformatting is the Best Way to Edit Existing Captions and Subtitles


Download the [FREE] Checklist: Caption Reformatting


Have you ever watched a rerun of your favorite television show with the original captions on and noticed that they don’t seem entirely correct? Closed captions could be delayed, covering graphics, paraphrasing dialogue…or maybe all of the above. You may wonder how these captions slipped through the cracks of quality control (QC), but you might be surprised to learn that the captions likely never went through the QC process at all. Why? Because they were never reformatted to match the video content they’re paired with.

Instead, an existing caption file (usually created for an older or original version of the video) was paired with an edited video–in this case, one edited for a rerun on another network or streaming service–when it should have been professionally reformatted. Reformatting is the best way to edit existing closed captions or subtitles and truly ensure their accuracy and compliance.

Reformats are an extremely important captioning and subtitling service, yet are seldom discussed when it comes to media accessibility. Caption/subtitle reformats are a crucial step in the editing of existing captioning and subtitling files for content that’s been adjusted in any way–even for seemingly minor things such as the removal of commercial breaks. Videos with these kinds of changes usually involve an update to the original caption or subtitle file through reformatting.

What is caption or subtitle reformatting?

Reformats update a caption or subtitle file when a video has been changed or edited in some way that makes it different from the original video. Captions and subtitles need to match the video they are being paired to, and if the video is different from the one that the caption/subtitle file originated from, the captions/subtitles are going to be incorrect.

Put simply, if you edit your video to an updated version, it will probably impact the captions or subtitles.

The problems that make a caption/subtitle file eligible for reformatting range from minor, barely noticeable discrepancies to egregious misalignments in dialogue and/or timing. In rare cases, a reformat may not be necessary, but only if the caption/subtitle file is unaffected by the video changes.

It is important to note that reformats aren’t usually meant for revising simple spelling or grammatical mistakes, nor are they meant for a caption timed a few frames behind dialogue. Think of reformats as editing caption and subtitle files on a larger scale as compared to singular revisions of files. Sometimes the two services collide, but reformats generally take a bit more time, depending on the scope of changes required.

When do you need a reformat?

Reformatting is necessary when there are changes made to the content of a video. While reformatting primarily concerns broadcast and streaming captions/subtitles on television and OTT platforms, it is recommended (and often necessary) for accurate captions and subtitles on any updated video.

Can I do a reformat manually?
In the case of very, very minor changes, such as a short word change or a spelling/grammar correction, yes, you can manually edit a caption file. At 3Play, we recommend manual revisions only for changes of that scale.

A caption or subtitle file with significant changes to timing, transcription, or format should be handed off to professional captioners with experience in reformatting to ensure fully updated, compliant files.

 

How does reformatting work?

Reformats are completed by professional captioners who edit the caption or subtitle file alongside the updated video content until the two are in sync and the content of the video and the caption/subtitle file match. Reformats are usually done in professional captioning software, which can import a variety of file types and videos, allowing captioners to work as efficiently as possible.

The time it takes to reformat a file varies based on the changes required, but on average, most customers can expect a reformat to be completed in approximately half the time it takes to originate a caption or subtitle file. The larger or more numerous the changes are, the longer the reformatting process can take.

 

Do you need to update your existing caption files? 👀

 

Why do I need a reformat to edit my existing captions or subtitles?

There are many reasons why you may need a caption/subtitle reformat, but sometimes it can be difficult to know if you truly need one. So let’s review some of the top scenarios in which a reformat would be required for your content.

Re-timing

Making any sort of timing adjustments, whether it’s cutting material or adding material, necessitates a reformat. When you add or remove time in a video, even if it’s just for commercials and contains no dialogue, you still need to account for that updated timing in the caption or subtitle file so that it can be offset accordingly and properly synchronized.
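The offset logic behind re-timing can be sketched in a few lines (a simplification with cue times in plain seconds; real workflows operate on caption file formats and frame-accurate timecodes):

```python
# Simplified sketch: when a segment (e.g., a commercial break) is removed,
# cues after the cut shift earlier by the removed duration, and any cues
# inside the removed segment are dropped entirely.

def retime(cues, cut_start, cut_length):
    """cues: list of (start_s, end_s, text) tuples in seconds."""
    adjusted = []
    for start, end, text in cues:
        if end <= cut_start:                       # before the cut: unchanged
            adjusted.append((start, end, text))
        elif start >= cut_start + cut_length:      # after the cut: shift earlier
            adjusted.append((start - cut_length, end - cut_length, text))
        # cues overlapping the removed segment are dropped
    return adjusted

# Removing a 2-minute break that started at the 10-minute mark:
cues = [(5.0, 8.0, "Previously..."), (725.0, 728.0, "Welcome back.")]
print(retime(cues, cut_start=600.0, cut_length=120.0))
# → [(5.0, 8.0, 'Previously...'), (605.0, 608.0, 'Welcome back.')]
```

Every cue after the cut must move, which is why even a dialogue-free edit still forces an update to the whole caption file.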

Transcription adjustments 

Changes to voice-over, dialogue, sound effects, and music need to be reflected in an updated caption or subtitle file. For example, swapping out music is fairly common on re-aired content for licensing reasons; if you change music–especially to a song with different lyrics or an entirely different mood–you need to ensure the caption/subtitle file captures this change in the transcription and timing.

Changes to graphics


The addition or removal of graphics, burned-in subtitles, or credits means that captions/subtitles must be manually adjusted by a captioner so that they don’t obscure these elements. Because placement is an FCC requirement, it’s particularly important that a file is properly reformatted to accommodate these changes.

Profanity & censorship updates

Profanity and censorship guidelines can vary based on air time or distribution to other networks and platforms. It’s critical to ensure that the captions/subtitles match the audio when it comes to profanity, whether it is bleeped, dropped, or uncensored; broadcasters and networks can face potential penalties for mismatches.

Video frame rate conversions

Video frame rate conversions always require a reformat to adjust timing in the caption/subtitle file. Sometimes the only change made to a video during editing is the frame rate itself; however, the caption file must be adjusted to match, as significant timing drift can occur when a file’s frame rate does not match the video’s. Frame rate changes can happen for a number of reasons, but are most common when prepping a video for online streaming or international distribution.
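To make the drift concrete, here is a minimal sketch (not 3Play's tooling; the numbers are illustrative) of how far a cue lands from its intended position when frame-based timings authored at one rate are replayed at another without rescaling:

```python
# Minimal sketch: estimating caption timing drift after a frame rate
# conversion, assuming cue timings are stored as frame counts at the
# source rate and replayed at the target rate without rescaling.

def drift_seconds(timestamp_s: float, source_fps: float, target_fps: float) -> float:
    """Seconds of drift at a given point in the program."""
    frames = timestamp_s * source_fps   # frame number the cue was timed to
    replayed_s = frames / target_fps    # where that frame lands after conversion
    return replayed_s - timestamp_s

# A cue at the 30-minute mark of a 25 fps master replayed at 23.976 fps
# has drifted by well over a minute:
print(round(drift_seconds(1800, 25, 23.976), 2))  # → 76.88
```

The drift grows linearly with runtime, so even a conversion that looks fine in the first few minutes can be badly out of sync by the end of the program.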

Outdated captions and subtitles

It is uncommon to come across captions and subtitles that haven’t been updated to the FCC’s captioning quality standards, but it does occasionally happen. These caption and subtitle files were typically created prior to 2013 and may use older styles of formatting and paraphrasing. If captions and subtitles are verbatim, synchronized, and appropriately repositioned for graphics, they can still be acceptable for broadcast; otherwise, they need to be updated. Note that reformatting outdated files is recommended even if they meet current FCC standards, to achieve optimal readability and accessibility for viewers.

Reformatting with 3Play Media


Did you know that 3Play Media provides reformatting services for existing caption and subtitle files?

Our experienced team of captioners can quickly and easily reformat any files you have in need of adjustment. Simply talk to our sales reps or account managers about our reformat add-on options to get started.

Not sure if you need a reformat?

We’re here to help. Our team is filled with experienced captioning professionals who have reformatted hundreds (even thousands!) of hours of caption/subtitle files for updated video content. Get in touch with us to begin scoping your project, and we can determine if reformatting is right for you.

Just want a quick fix?

Try our Caption & Subtitle Editor to quickly make spelling and other small adjustments to captions and translations.

 

Do your captions and subtitles need a refresh? Our Caption Reformatting Checklist can help! Free download.



SDH vs. non-SDH Subtitles: When to Use Them https://www.3playmedia.com/blog/sdh-subtitles-non-sdh-subtitles-when-to-use-them/ Mon, 29 Aug 2022 20:08:44 +0000 https://www.3playmedia.com/blog/sdh-subtitles-non-sdh-subtitles-when-to-use-them/ How to Write WebVTT Files [Free Guide] Do you ever watch a television show on a streaming platform with the subtitles turned on? Perhaps it’s an anime, and the subtitles translate Japanese dialogue into English. Or maybe it’s the English-language cerebral drama topping the watchlists of everyone you know. Well, hang on. Since the show’s...

The post SDH vs. non-SDH Subtitles: When to Use Them appeared first on 3Play Media.


  • Captioning

SDH vs. non-SDH Subtitles: When to Use Them


How to Write WebVTT Files [Free Guide]


Do you ever watch a television show on a streaming platform with the subtitles turned on?

Perhaps it’s an anime, and the subtitles translate Japanese dialogue into English. Or maybe it’s the English-language cerebral drama topping the watchlists of everyone you know. Well, hang on. Since the show’s in English, maybe they’re captions; they have speaker IDs and sound effects transcribed, after all. Subtitles don’t usually include those, right?

The answer is yes and no. There are two different types of subtitles: SDH and non-SDH. While these subtitles share many of the same characteristics, they serve different functions for their intended audiences. In this article, we’ll explore what SDH and non-SDH mean, when each is used, and how they differ from closed captions.

What are SDH subtitles?

SDH stands for subtitles for the d/Deaf and hard of hearing. These subtitles assume the end user cannot hear the dialogue and include important non-dialogue information such as sound effects, music, and speaker identification. In the United States and Canada, SDH traditionally assumes the end user cannot understand the language being spoken, as is typical with subtitles. This isn’t the case in places like the UK, where “subtitles” can also refer to “captions”. 

SDH Subtitles
Transcribed elements: Dialogue AND non-dialogue information (sound effects, music, speaker identification)

What are non-SDH subtitles?

Non-SDH, or non-subtitles for the d/Deaf and hard of hearing, are traditionally referred to as just “subtitles.” Non-SDH assumes the end user can hear the dialogue and understand the non-dialogue information but cannot understand the language. The only transcribed element of non-SDH is dialogue. On-screen graphics or words may also be transcribed, when time allows for the translation of these elements.

non-SDH Subtitles
Transcribed elements: Dialogue only
Optional transcribed elements: On-screen graphics or text, when time allows.

How do non-SDH subtitles differ from SDH subtitles?

Both SDH and non-SDH subtitling types have the same technical features in terms of characters per row, line limits, timing, and visual appearances. Both also have the goal of providing translations of dialogue into another language. The main difference lies within their audience assumptions. Put simply, non-SDH is targeted to a hearing audience, whereas SDH is targeted to a d/Deaf or hard of hearing audience.

How to write your own subtitle and caption files 🙌

Now that we’ve defined SDH and non-SDH subtitles and outlined the differences between the two, let’s consider the right use cases for each. Determining which format is best can be tricky and nuanced! We have outlined several fictional scenarios to demonstrate the considerations you should make when deciding what is best for your industry and content type.

When to use SDH Subtitles 

Scenario: Blu-ray release of sitcom requires a CC-to-SDH reformat

A major television network is releasing a special 10th-anniversary edition of a popular sitcom that formerly aired on their network. This edition will only be released in the US, but the studio producing the Blu-ray has discovered that their closed caption files for the sitcom are incompatible with the disc format. They decide to have their broadcast closed caption files reformatted into SDH subtitle files to give d/Deaf and hard of hearing audiences a closed captioning alternative.

SDH Subtitles = Accessible captioning alternative
In our 2022 State of Captioning Report, we learned that 72% of respondents were captioning to provide access for d/Deaf and hard of hearing audiences. This major television network and their distribution studio know the importance of providing captions. By choosing SDH subtitles as their alternative for captions, they’re making their Blu-ray release fully accessible to a broader audience.

Scenario: SDH subtitles for virtual event recordings

US-based Company XY has been gearing up for its annual virtual event extravaganza. The event includes workshops, lectures, and webinars from industry experts and is a great opportunity for employees and customers to learn. While they’ve typically used their video platform’s auto caption feature for the live portion, they’ve received feedback from d/Deaf and hard of hearing attendees that they’d be interested in cleaner English captions or subtitles for on-demand event content after the fact. XY decides on English SDH subtitles for their post-event on-demand content so that they can meet the needs of their English-speaking d/Deaf and hard of hearing audience members, and subsequently use the SDH subtitles to later translate and better reach d/Deaf and hard of hearing multilingual audience members as well.

SDH Subtitles = Accessible virtual events for global audiences
A recent study by Grand View Research reported that the global virtual events market is projected to be worth $657.64 billion by 2030. Company XY’s commitment to their audience’s experience by providing live captions during the event, and then SDH subtitles in English and other languages after the event, will help them tap into the virtual events industry’s incredible projected growth while simultaneously creating an inclusive experience for their attendees.

 

When to use non-SDH subtitles

Scenario: MNC goes global with non-SDH Subtitles

Multinational Company X is headquartered in the US. X recently expanded their reach by setting up office locations in South America, Europe, and Asia. They only have training and corporate communication videos in English, but they recognize their need for localized subtitles.

non-SDH Subtitles = Breaking down employee language barriers
A 2015 study by Helene Tenzer and Markus Pudelko found that “language barriers can create emotional conflict between native and non-native speakers” at multinational corporations. By utilizing non-SDH subtitles for their internal videos, Multinational Company X is able to provide translations of content to help increase understanding and break down potential barriers for their employees, no matter where they’re located.

Scenario: MOOC expansion supported by non-SDH subtitles

University of Z is based in the US, and has been offering massive open online courses (MOOCs) in addition to traditional in-person classes for several years now. They recently decided to expand their offerings to reach not only English learners, but multilingual students as well. They’ve chosen to include non-SDH subtitles on all of their MOOC video content in Spanish, Portuguese, Mandarin, French, and German as a starting point.

non-SDH Subtitles = Increasing the reach of MOOCs
According to Roberto Rey Agudo, a language program director in the department of Spanish and Portuguese at Dartmouth College, DartmouthX saw a 50% enrollment increase for a course on the philosophy of science when Portuguese was added as a language of instruction. From this, it can be surmised that localized MOOC offerings, such as multilanguage subtitled videos, can have a huge impact on the reach of institutions like University of Z.

When to use both SDH and non-SDH subtitles

Scenario: Big OTT release requires SDH and non-SDH subtitles

Production Company Y is readying their newest sci-fi show for exclusive release on Very Big OTT Streaming Platform. Former shows they’ve created have gone to major broadcast networks in the US, so they’ve only ever ordered English closed captions to meet FCC requirements. Their post-production supervisors have learned that Very Big OTT requires both English SDH subtitles and non-SDH subtitles.

SDH & non-SDH Subtitles = Supporting OTT platforms’ accessibility needs
The number of digital video viewers worldwide is expected to hit nearly 3.5 billion by 2023, and much of this is fueled by the abundance of premium video content available via OTT media services. Many of these providers, such as Netflix, don’t support standard broadcast closed captions, leading to a demand for English SDH subtitles styled similarly to FCC-compliant closed captions instead. OTT providers often request non-SDH subtitles to help support translation workflows and international distribution needs as well. The nicest part about ordering both SDH and non-SDH subtitles together is that one can be used to create the other fairly easily, which saves both Production Company Y and Very Big OTT a lot of precious time!

What about closed captions?

So how do closed captions factor into all of this? Captions are closely related to subtitles, and the two are often confused because SDH subtitles frequently stand in for captions on media that doesn’t support broadcast closed captions, such as digital connections like HDMI and OTT media platforms. SDH can also visually mimic closed captions, further muddling how each is defined. However, captions and SDH differ in a few ways, including:

  • SDH subtitles have more room for transcription, supporting 42 characters per row versus CC’s 32-character-per-row limit.
  • SDH subtitles are timed much tighter to the audio than closed captions.
  • SDH subtitles vary between fonts, colors, and styles, and can look different depending on the media player or platform.
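The character-per-row difference is easy to see by wrapping the same line of dialogue at both limits (an illustration only; professional tools also weigh line-break points and reading speed):

```python
# Illustration: the same dialogue wrapped at the broadcast CC limit
# (32 characters/row) vs. the typical SDH subtitle limit (42/row).
import textwrap

line = "The storm knocked out power across the entire eastern seaboard last night."

for label, limit in [("CC, 32/row", 32), ("SDH, 42/row", 42)]:
    rows = textwrap.wrap(line, width=limit)
    print(f"{label}: {len(rows)} rows")
# CC needs 3 rows for this line; SDH fits it in 2.
```

The extra room per row means SDH files can often keep a sentence together on screen where a CC file would have to break it up.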

 

Create Your Own WebVTT Files. Download the Guide.



How to Translate Videos into Foreign Languages https://www.3playmedia.com/blog/how-to-translate-videos-into-foreign-languages/ Mon, 08 Nov 2021 14:03:59 +0000 https://www.3playmedia.com/blog/how-to-translate-videos-into-foreign-languages/ As the world continues to become more interconnected, it’s important to create content that has the ability to reach a global audience – which often entails translating your video content to a foreign language.  Technology has given people around the world access to online content like never before. This became even more apparent and important...

The post How to Translate Videos into Foreign Languages appeared first on 3Play Media.


  • Localization

How to Translate Videos into Foreign Languages

As the world continues to become more interconnected, it’s important to create content that has the ability to reach a global audience – which often entails translating your video content to a foreign language. 

Technology has given people around the world access to online content like never before. This became even more apparent and important during the COVID-19 pandemic. In 2016, video made up 72% of all global consumer internet traffic, and that number has increased over the past five years. Today, video traffic accounts for 82% of all internet traffic.

[Image: Top 10 countries with the most YouTube views, including the USA, Brazil, Russia, Japan, India, the UK, Germany, France, Mexico, and Turkey]

One of the largest and most popular video platforms in the world, YouTube has amassed over a billion users worldwide. In 2021, there were approximately 1.86 billion YouTube users worldwide, up from 1.47 billion in 2017. Although the platform was created in the United States, eight of the top 10 countries with the most YouTube users are non-English-speaking countries. 

There’s no denying the profound impact of globalization. Subtitles, translations of the dialogue into other languages, offer people around the world the capability to engage with your content. 

If you want to reach a global audience, we show you just how easy it is to translate videos into foreign languages. 

Why should you translate videos?

There are many benefits to translating videos into foreign languages, like making your content accessible, searchable, and engaging. For starters, we wouldn’t be able to binge-watch some of our favorite international films and shows. Two of my personal favorite foreign language shows on Netflix are Dark, a mind-boggling mystery-drama set in Germany, and Money Heist, a truly captivating Spanish drama about a mastermind who sets to pull off the biggest heist in history. 


As mentioned, translations allow your content to reach a wider audience, expanding your viewership to loyal viewers looking for the next binge-worthy show (including myself). People around the world are creating and viewing content, but unfortunately, videos aren’t always being translated into foreign languages. Content creators are missing out on the opportunity to get their videos in front of the eyes of billions of people who want to watch their content. 

In addition, videos in foreign languages increase SEO and video views. Just like with captions, the text file for translations helps search engine bots crawl for relevant keywords, helping your video rank higher in search engine results pages (SERPs) in those languages. If your content isn’t being translated into a foreign language, it will not show up in search results of the targeted country. It’s likely your competitors aren’t translating their content, so get ahead of the curve! 

 

 

Subtitles also give viewers the flexibility to watch videos in sound-sensitive environments. Our handy smartphones allow us to watch video anywhere, as long as there is internet access. How many times have you seen people watch videos on a noisy train or while working out at the gym? If viewers don’t have headphones, they can still watch videos and understand the auditory information without the need for sound. This helps to improve the user experience, giving non-English speaking viewers the potential to watch videos in any setting for longer periods of time.  

Lastly, but certainly not least, translations make your content accessible to the millions of people around the world with hearing loss. According to the World Health Organization, about 5% of the world’s population, or 466 million people, have hearing loss.  It is estimated that by 2050 over 700 million people – or one in every ten people – will have disabling hearing loss. Translations allow the deaf and hard of hearing community around the world to enjoy your content as well. 

 

How to translate video content:


Now for the good stuff. Translating your video into a foreign language is easy once you tackle this first step. The typical and easiest way to create translations is to first create captions in the original language. 

Captions are not interchangeable with subtitles. Captions are a visual representation of the audio in a video and assume a viewer cannot hear. Subtitles, on the other hand, assume a viewer can hear but doesn’t understand the language.

Once you’ve created a caption file, it becomes much easier to create translations. Before publishing, make sure you check your final transcript for accuracy. If you’re working with a vendor, be sure to ask if they offer subtitles. 

DIY Translations

  1. Create captions in the original language. You can DIY them or use YouTube’s automatic captioning feature.
  2. Go to your Creator Studio on YouTube, then select Video Manager. Click the Edit button under the video you want to translate.
  3. Go to the Subtitles/CC tab at the top of the video editor.
  4. Click Add New Subtitles or CC. A search bar will appear.
  5. Search for the language you wish to translate to.
  6. A new menu will appear. Click Create New Subtitles or CC.
  7. You’ll be redirected to YouTube’s video editing interface. Above the transcript, you’ll see the Autotranslate button.
  8. Your translation will appear under the original transcript. You can easily edit it right in the interface.
  9. Finally, select Publish once you’ve made sure your translations are accurate!

Note: although creating translations in YouTube is a free solution, it’s notoriously inaccurate. What you save in money, you sacrifice in time and resources spent editing the translations. If your translations are inaccurate, not only are they distracting for viewers, but you could also face a potential lawsuit for violating major international accessibility laws that protect people with disabilities. 

 

More YouTube hacks for captions and subtitles

Creating translations with a third-party service

At 3Play Media, our proprietary captioning & transcription software makes it easy for our customers to translate their content into over 40 languages, including Spanish, French, Japanese, and more.

We also provide a dual-language service, which gives users the ability to upload content containing more than one language (for example, English and Spanish) so that the corresponding caption file also contains these languages. Spanish is the second most commonly spoken language in the U.S. after English – it’s important that online video content reflects this bilingualism with accurate captions, and many vendors don’t offer this functionality.

To translate your videos with 3Play, simply:

  1. Upload your videos for captioning from a computer, via links, integrations, or custom APIs.
  2. Once your captioned file is complete, order your translations by clicking the More Actions tab next to your file, then selecting Order Services, and Translation.
  3. Select a Language from the 40+ languages available, and complete your order. Once completed, you can download your files in any format you need.

 

[Image: Screen capture of the translation order form in the 3Play account system]

 

It’s that simple! Your job is to create engaging content – let us take something off your plate by ensuring your translations are accurate. 

We always put quality first. Translations are done by professional linguists who understand the cultural nuances of the language. Our linguists are experts in a wide range of topics, like healthcare, technology, and finance.

Our translations are not word-for-word or literal. They read the way a native speaker of the language would actually speak, which ensures that the translations sound natural and can be easily understood.

When you have a 3Play account, you have access to a number of features and tools. The subtitle editor allows you to make changes or redactions to translations even after they’ve been processed.

The translation profile lets users give context about the content and tell the translator more about your organization, allowing for more precise translations.


Want to learn more about how you can incorporate translation into your video production process?

Read the checklist: How to Incorporate Translation into your Video Production Process

This post was originally published by Samantha Sauld on July 16, 2019 and has since been updated for accuracy, clarity, and freshness.



The post How to Translate Videos into Foreign Languages appeared first on 3Play Media.

Squid Game’s Subtitles: When Meaning Gets Lost in Translation https://www.3playmedia.com/blog/squid-games-subtitles-when-meaning-gets-lost-in-translation/ Thu, 14 Oct 2021


  • Subtitling

Squid Game’s Subtitles: When Meaning Gets Lost in Translation

 

Discover the Benefits of Translation and Subtitling

 

Squid Game, a shocking dystopian thriller about class and capitalism in Korea, is on track to become Netflix’s most-watched show of all time. Viewers in hundreds of countries are binge-watching the Korean-language drama while reading subtitles—a testament to Parasite director Bong Joon-ho’s 2020 Oscar acceptance speech. “Once you overcome the one-inch tall barrier of subtitles,” Bong said, “you will be introduced to so many more amazing films.”

While subtitling and dubbing make Squid Game watchable for millions of viewers, bilingual and multilingual Korean viewers have called out inaccuracies in the translation that significantly impact the show’s meaning.

In this blog, we’ll explore the controversy behind the show’s subtitles, the repercussions of inaccurate and culturally uninformed translation, and why subtitling can prove challenging for even the biggest of streaming companies.

Captions or Subtitles: Which are Viewers Using?

An essential piece of the Squid Game controversy is the difference between captions and subtitles. Though many people use the terms interchangeably, their differences are profound: captions are intended for viewers who can’t hear the audio in a video and include non-speech elements such as music and sound effects, while subtitles are intended for viewers who can’t understand the language in a video.

[Image: Translation and subtitle symbol on a video screen]

Many Korean speakers have used social media to discuss Netflix’s Squid Game translation, sometimes inadvertently equating captions and subtitles. One of the most popular TikTok videos about the translation has received over 12 million views. In the video, Youngmi Mayer, a comedian and podcaster fluent in Korean, discusses numerous examples in which the dialogue is “botched” and how the inaccuracies ignore meaningful cultural nuances and tropes. Mayer also tweeted, “If you don’t understand Korean, you didn’t really watch the same show,” underscoring how deeply translation and cultural context shape meaning.

However, Mayer later noted that she made her original comments after watching Squid Game with English closed captions rather than English subtitles, an option not available for d/Deaf and hard-of-hearing viewers. In Squid Game, the closed captions seem to follow the dubbing scripts. Dubbing, which involves recording a spoken audio track in a different language and overlaying it into the original video as a replacement for the speaker’s voice, is notoriously tricky to get right. Dubbed dialogue needs to match the original dialogue’s onscreen duration and an actor’s syllables and lip movements, so it’s challenging to fit in complex translations. 

Mayer later said Squid Game’s English subtitles are “substantially better” than the closed captions. However, she also said, “the misses in the metaphors—and what the writers were trying to actually say—are still pretty present.” Although Netflix’s subtitles provided superior translation to the captions, they still fell short on accuracy since they were missing cultural context. Additionally, Netflix provided an inferior viewing experience for people who rely on closed captions.


 Learn how to select the right translation vendor ✅ 


What Inaccurate Translations Can Mean for Viewers and Creators

Mayer’s comments hint at a larger issue with video translation and have spurred a meaningful conversation around how inaccuracies can impact a show’s meaning and creator’s intent.

Inaccurate and culturally uninformed translations can create a different viewing experience for non-Korean speakers watching Squid Game, which many argue changes the show’s meaning. For creators, inaccurate translations are an unfair and disrespectful representation of their story. Though “compromise is inevitable in subtitle translation,” according to Darcy Paquet, who composed the English subtitles for Parasite, translators must pay close attention to cultural nuance and expression so as not to alter a creator’s meaning.

Given Squid Game’s immense popularity, it’s hard not to wonder how much more successful the show could be if viewers had a better understanding of the original dialogue and cultural context.

Why is Translation So Challenging?

In Netflix’s 2020 review, the company noted that views of foreign-language titles were up over 50% from 2019 and that views of Korean dramas had tripled. Netflix is also making a significant investment in Korean language content—in early 2021, they announced plans to spend nearly half a billion dollars throughout the coming year on Korean content.

[Image: Speech bubbles]

So, with all the money spent towards Korean-language content, why is Netflix, the world’s largest streaming service, falling short on translation?

While we can’t answer this question as it pertains specifically to Netflix, we can speak to the challenges of subtitling.

For both subtitles and closed captions, there are character limits so that audiences have enough time to read text on the screen as it appears. If one line of dialogue in Korean is best translated with four lines of dialogue in English, this translation would be hard to fit into the allotted time frame and character limit, hence the common occurrence of less accurate but more concise translations.
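These constraints can be checked mechanically. The limits below (roughly 42 characters per line and 17 characters per second) are common guideline values, though exact numbers vary by style guide and language; this is an illustrative sketch with invented example cues, not any platform’s official rule:

```python
# Illustrative guideline values; exact limits vary by style guide and
# language (many English guides use ~42 chars/line, ~17 chars/second).
MAX_CHARS_PER_LINE = 42
MAX_CPS = 17.0  # characters per second

def check_cue(text, duration_seconds,
              max_chars=MAX_CHARS_PER_LINE, max_cps=MAX_CPS):
    """Flag subtitle text that is too long or reads too fast."""
    problems = []
    for line in text.splitlines():
        if len(line) > max_chars:
            problems.append(f"line exceeds {max_chars} chars: {line!r}")
    cps = len(text.replace("\n", " ")) / duration_seconds
    if cps > max_cps:
        problems.append(f"reading speed {cps:.1f} CPS exceeds {max_cps}")
    return problems

# A concise rendering of a two-second cue fits comfortably...
print(check_cue("You're in danger.", 2.0))  # prints []

# ...while a fuller, more literal four-line rendering of the same
# two-second cue fails the reading-speed check:
long_text = ("If you keep playing this game,\n"
             "you will put yourself and everyone\n"
             "you love in very serious danger,\n"
             "so please stop before it is too late.")
print(check_cue(long_text, 2.0))
```

A faithful multi-line rendering of one short cue blows past the reading-speed limit, which is exactly why subtitlers often choose the more concise, less literal option.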

Another challenge with translation is cultural nuance. Though a translator may understand a language, gaining cultural nuance is often only possible through deep integration in a country and culture. Dialogue usually contains region-specific slang, idioms, and cultural references, which can be hard to explain to foreign viewers in a script’s allotted time frame. For this reason, it is imperative that translators thoroughly understand a region’s culture to translate content correctly.


 Learn the right questions to ask a translation vendor ➡ 


3Play Media’s Video Translation Service

At 3Play Media, we offer a robust video translation and subtitling service that gives our customers peace of mind. 

We integrate with over 20 major video players and platforms, seamlessly translating your content with little to no work on your end. With over 40 languages to choose from, you can ensure that your video content is understandable for a broad audience, no matter their language preference. 

Our translations are done by professional linguists who ensure your video content is translated accurately and maintains the cultural context of the original language.

Looking to partner with a video translation vendor that can achieve accuracy and cultural nuance? 3Play Media is here to help. Discover why more than 10,000 companies put their trust in us every year.


How to select the right translation vendor: questions you need to ask. Download the checklist.



How to Choose Languages for Video Translation https://www.3playmedia.com/blog/how-to-choose-languages-for-video-translation/ Mon, 12 Jul 2021


  • Localization

How to Choose Languages for Video Translation


Incorporating Translation Into Your Video Production Process [Free Checklist]


In our increasingly globalized society, it’s hard to overstate the importance of translation and subtitling. Although the number is constantly in flux, it’s estimated that there are over 7,000 languages spoken worldwide.

Online video content is consumed in thousands of different languages every day. To make your content accessible to viewers who speak other languages, you need to provide subtitles, translations of a video’s dialogue. Video translation has many benefits, such as making your content accessible, searchable, and engaging for a broader audience. However, video translation costs money, and providing subtitles for every language isn’t economically feasible. So, how do you choose the most suitable languages for video translation?

There’s no perfect answer, and every company’s choice will be different. In this blog, we’ll discuss some key metrics and considerations you’ll want to explore in determining which languages for video translation are right for you.

Where are your viewers or customers located?

The optimal languages for video translation will differ for every company depending on where most of their customers or viewers live. There are multiple ways to discover this information, but a straightforward way is to use Google Analytics to learn about your web traffic.

[Image: People standing by a globe saying hello in different languages]

To access this information, open up Google Analytics and click on Reports. Under User, you can click User Attributes Overview, which will show your users’ demographic information for your specified timeframe. The overview shows your users’ countries, cities, and languages, and you can sort by all users, new users, and returning users. You can then click on demographic details for session engagement and conversion data.

Translating videos into every language your visitors speak is neither necessary nor financially feasible, so we recommend looking for the locations and languages that garner significant web traffic.

In addition, you can pursue the strongest opportunities by looking at which locations and languages have the highest engagement—for example, through a lower bounce rate or higher pages per session. With this data, you can make a more informed decision for translating your videos into languages most beneficial for your business.
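One way to turn that analytics data into a decision is to filter out low-traffic languages and rank the rest by engagement. The sketch below uses made-up numbers and illustrative field names (not the actual GA4 dimension names) to show the idea:

```python
# Made-up traffic numbers standing in for an analytics export; the
# field names here are illustrative, not GA4's actual dimensions.
rows = [
    {"language": "es", "sessions": 1200, "engaged_sessions": 900},
    {"language": "fr", "sessions": 800,  "engaged_sessions": 640},
    {"language": "de", "sessions": 950,  "engaged_sessions": 380},
    {"language": "ja", "sessions": 300,  "engaged_sessions": 270},
]

def rank_languages(rows, min_sessions=500):
    """Rank languages with meaningful traffic by engagement rate."""
    qualified = [r for r in rows if r["sessions"] >= min_sessions]
    return sorted(qualified,
                  key=lambda r: r["engaged_sessions"] / r["sessions"],
                  reverse=True)

for r in rank_languages(rows):
    rate = r["engaged_sessions"] / r["sessions"]
    print(f'{r["language"]}: {r["sessions"]} sessions, {rate:.0%} engaged')
```

Here French edges out Spanish on engagement rate despite fewer sessions, while Japanese is dropped for falling under the traffic threshold; the threshold and the metric are choices you would tune to your own data.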

If you publish videos on YouTube, you can easily find where most of your viewers live by navigating to the YouTube Studio and Channel Analytics. Click on the Audience tab and scroll to the geography section, where you’ll find your visitors’ top locations for a given time period.

By discovering where your viewers live and what languages they speak, you can make a more informed decision when choosing languages for video translation. However, don’t assume that having few customers or viewers in a particular country means you shouldn’t translate into their primary language—sometimes, numbers from a certain country are low because you don’t have translation options available.

Where does your business operate?

In deciding which languages to translate your videos into, you must consider where your business operates. If you have global offices or employees, translating your content into the main languages of those regions makes sense. Large companies with offices worldwide rely on video translation to help disseminate and localize company content for employees and customers alike.

However, if you only operate in one country, don’t be deceived into assuming one language is sufficient. People speak many languages in different countries, and there are many benefits to translating your content, such as expanding your audience and boosting your video SEO.

 

 Learn how to select the right translation vendor for you ➡ 

 

Which languages have the highest potential for global reach?

Since the beginning of 2021, 3Play Media’s top 10 translation requests by the number of files ordered were for the following languages:

  1. Spanish
  2. French
  3. Japanese
  4. German
  5. Portuguese
  6. Italian
  7. Korean
  8. Chinese
  9. Polish
  10. Arabic

This data indicates a strong need for translation into these particular languages. Although we provide translation services for over 40 languages, these languages make up a large segment of our customers’ requests. Knowing this information can help you make more informed language translation decisions.

YouTube statistics can also provide insight into great translation opportunities. For example, according to Backlinko, the top five countries using YouTube in 2021 by the total estimated number of users are:

Country   Estimated YouTube users
India     225 million
USA       197 million
Brazil    83 million
Japan     60 million
Russia    58 million

This data indicates that India’s major languages, along with English, Portuguese, Japanese, and Russian, might be great choices for video translation. However, Backlinko elaborates that the breakdown is different when analyzing the total number of video views from each country: this metric puts the US ahead of India and shows that the average user in the US watches many more videos than the average user in India.

[Image: Language translation shown on a computer screen]

According to Backlinko, the top 10 countries sorted by total viewers are the US, India, the UK, Brazil, Thailand, Russia, South Korea, Spain, Japan, and Canada. This data suggests Thai, Korean, and Spanish might be great options for translation, too.

Since 80% of views on YouTube come from outside the United States, translation and subtitling are imperative for reaching a global audience. Knowing which languages are used most often on your video streaming platform is a great way to make more informed decisions for video translation.

Consider ASL Interpretation

ASL, short for American Sign Language, is not only the sign language most commonly used by Deaf people in the U.S. but also the third most commonly used language in the U.S. after English and Spanish. While ASL is a relatively young language, emerging in the early 1800s, it grows in usage each year; as of 2023, ASL is the third most popular language course at Yale.

Adding ASL interpretation to video and audio-only content is an effective way to reach new audiences and make your content more accessible. For example, at 3Play Media, we have added ASL interpretation to podcast episodes and webinars, which you can watch in action below:

In conclusion

Though there is no perfect answer for which languages to choose for video translation, following the tips mentioned above is a great place to start. If you deeply understand your audience and potential for global reach, you’ll be well-equipped to make informed video translation and subtitling decisions.


Get Started: Video Translation and Subtitling. Learn more.


