Kids are less engaged when they don’t see or hear themselves on screen. So naturally, that’s a barrier for millions of hard-of-hearing children around the world. But Ireland’s JAM Media is looking to change all of that with a new motion-capture project called Zaki Signs.
The studio is combining mocap, machine learning and Epic Games’ Unreal Engine to create, in real time, an animated character who interprets content into American Sign Language (ASL).
Traditionally, adapting kids content for deaf viewers has meant either putting captions on screen (not super-helpful for younger children who can’t read yet) or using a picture-in-picture setup with an ASL interpreter signing in a corner. YouTube creators have adopted similar approaches.
But using adult interpreters feels like more of a barrier than a gateway when it comes to engaging kids, says studio owner John Rice. And that’s why JAM decided to focus on turning the human performer into an animated character (named Zaki) on screen.
JAM wants to have Zaki Signs ready to go by end of year, so it’s busy tweaking character designs to be sure the program can capture all the necessary nuances in facial expressions. Ultimately, the studio hopes to develop a product that can interpret in any sign language, not just ASL—and that’s critical, given that there are more than 70 million deaf people worldwide who use 200-plus sign languages, according to the World Federation of the Deaf.
But the number of viewers who would benefit from Zaki is potentially much higher if you also factor in kids with serious hearing loss—around 34 million globally, according to the World Health Organization.
JAM has already fielded significant interest from broadcasters and producers with large back catalogues of programming they’d like to make more accessible for this audience. “The reaction to Zaki so far has been out of this world,” says Rice. “We want to make it easy for producers to remove barriers to their content and create a bridge to reach deaf viewers.”
This isn’t the first time motion-capture technology has been leveraged for this purpose in recent years. Finnish pubcaster YLE experimented with it in 2019, and France’s MocapLab is currently tackling the challenge of enhancing the precision of mocap sign language. However, the task is especially difficult, given that signing requires complex finger movements and full facial expressiveness.
But the tech that underpins Zaki is now so well developed that JAM can produce high-quality work in real time, and it has to be done in real time if it’s going to be commercially viable, says Rice. With JAM’s tools, a company can add Zaki to a 52 x 11-minute series in roughly the same number of minutes as the series’ total runtime. And the plan is to keep fine-tuning so that by the time Zaki launches, there won’t be any need for animation cleanup work when it’s integrated.
Clients can also customize how the Zaki character appears, whether in a picture-in-picture box or as a close-cropped image right on top of what’s unfolding on screen. Pricing for working with JAM is proportional to the reach and scope of the project, but generally runs about 10% to 20% more than hiring a sign language interpreter alone (typically US$100 an hour or more). With Zaki, clients still hire the interpreter; JAM provides the haptic gloves used in motion capture to render the human performer’s movements into animation, guides the interpreter through the movements and expressions that Zaki will overlay, films the performance, and then animates it using the Zaki character.
On the other side of the coin, JAM only needs an iPhone to capture the video instead of expensive cameras, and the goal is to eventually offer Zaki at the same cost that most signing companies charge for just the human interpreter.
Zaki’s engagement potential goes beyond the ASL-signing character’s ability to break down barriers—he could also be a buddy. Zaki will be able to watch TV with a child and react to the content on screen, making them feel like he’s part of their experience. And he might even spark an interest in learning to sign among hearing audiences, or come to life in his own branded content and consumer products.
An early backer of this technology is Screen Ireland, which gave JAM a portion of the US$432,000 in funding that it handed out last December to projects exploring new ways of using technology to tell stories (enough to cover about 50% of the project’s budget at its pilot stage).
But JAM hopes to find additional development funding from NGOs and organizations dedicated to people who are deaf and hard of hearing, says Rice.
This story originally appeared in Kidscreen’s May/June 2024 magazine issue.