Gestures can be made readable
For years, scientists have worked to make it easier for deaf and hearing-impaired people to communicate.
And now it is hoped that a new intelligent system could be about to transform their lives. Researchers have used image recognition to translate sign language into 'readable language' and while it is early days, the tool could one day be used on smartphones.
Scientists from Malaysia and New Zealand came up with the Automatic Sign Language Translator (ASLT), which can capture, interpret and translate sign language.
It has been tested on gestures and signs representing both isolated words and continuous sentences in Malaysian sign language, with what they claim is a high degree of recognition accuracy and speed.
Its creators say that it has the potential for use in multiple languages. The tool uses image processing and pattern recognition to translate actions into words.
"At the heart of the ASLT are real-time image processing and computational intelligence methods," said researcher Prof Rini Akmeliawati, of Malaysia's IIUM University.
"We developed a novel approach, leading to efficient detection and tracking of face, hands and upper body trajectories of a signer.
"By combining it with our tools for artificial intelligence-based matching between these sign trajectories and elements of a large database of images and video recordings of native signers, we have achieved a fast and flexible automatic sign language translation system.
"The system's potential lies in its technologically advanced algorithms and structure, which can be adapted to a multitude of the world's sign languages."
Everyday communication is a major challenge to a great many hearing-impaired people, as well as those unable to speak, all around the world.
The scientists believe their work will lead to a portable, efficient and affordable ASLT covering a wide variety of sign and written languages.