IBM's sign language technology

IBM researchers in Hursley, England, have developed an innovative technology called SiSi (Say It Sign It), which converts spoken words into British Sign Language (BSL). The technology can be of great help to deaf people.

Here's a quote from an article at The Press Association:

"This technology has the potential to make life easier for the deaf community by providing automatic signing for television broadcasts, and making radio news and talk shows available to a new audience over the Internet, or by providing automated voicemail transcription to allow them to make better use of the mobile network."

With the aid of this technology, digital characters can convert spoken words into signed language, essentially digitizing the role of sign language interpreters. SiSi relies on speech recognition to convert spoken words to text; the text is then translated into sign-language gestures performed by a customizable virtual character, or avatar.
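To make the two-stage pipeline concrete, here is a minimal sketch in Python. SiSi's actual APIs are not public, so every name here (the lexicon, `transcribe`, `text_to_gestures`) is hypothetical; the speech-recognition stage is stubbed out, and unknown words fall back to fingerspelling, as a real signing system might.

```python
# Hypothetical sketch of a SiSi-style pipeline: speech is transcribed
# to text, then each word is mapped to a sign-gesture identifier that
# an avatar could animate. Names and lexicon are illustrative only.

SIGN_LEXICON = {
    "hello": "GESTURE_HELLO",
    "world": "GESTURE_WORLD",
}

def transcribe(audio):
    """Stand-in for a real speech-recognition engine: here we simply
    treat the input as already-recognized text."""
    return audio.lower()

def text_to_gestures(text):
    """Map each word to a gesture ID; unknown words fall back to
    fingerspelling, one letter at a time."""
    gestures = []
    for word in text.split():
        if word in SIGN_LEXICON:
            gestures.append(SIGN_LEXICON[word])
        else:
            gestures.extend(f"FINGERSPELL_{ch.upper()}" for ch in word)
    return gestures

def speak_to_sign(audio):
    # Stage 1: speech -> text; Stage 2: text -> gesture sequence.
    return text_to_gestures(transcribe(audio))

print(speak_to_sign("Hello world"))
# ['GESTURE_HELLO', 'GESTURE_WORLD']
```

Note that this word-by-word mapping produces Signed English rather than a true sign language such as BSL or ASL, whose grammar differs from spoken English; a faithful system would need a translation stage, not just a lookup.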

The technology is not radically new, but it is a definite boon to the deaf and hard-of-hearing community. From presentations to mobile phones, it has numerous potential applications.

I can see why some consider this innovation a good advance, but there are some things that just cannot be replicated. (My mother is a teacher of the deaf and hard of hearing, and she's taught me a lot about the language and the community. I'm basing my points here on American Sign Language, which I suspect is similar to BSL, though I don't know what the differences are.)

The big thing here is that there are actually two "dialects," if you will, of sign language: American Sign Language (ASL) and Signed English. Signed English is just that: you sign, word for word, what someone speaks. It's very proper and straightforward. ASL, however, is in fact a different language, with a sentence structure of its own that sets it apart from straightforward English. Assuming British Sign Language is likewise its own language, distinct from the English spoken there, this technology will miss a lot of nuances that are important.

Why are they important? Because many people (in the American deaf community, at least) use ASL and have a harder time understanding Signed English. Yes, the words are the same for the most part, but the way they are arranged in a sentence is very different and sometimes harder to comprehend. Just as we would not want to hear a computer translate a foreign language, such as French or Italian, into English, because we would miss the fine details hidden in the language, I myself would not want to rely on a computer to translate any spoken word into sign language. I don't think interpreters around the world will be out of a job any time soon.
