One of the promising applications of artificial intelligence is automatic voice and text translation. Its accuracy, although it improves over time, does not yet reach a professional level. But for everyday communication, and in certain languages, the technology serves its purpose well.
Some languages are more difficult than others, of course. Translating from English to French or Spanish is relatively simple for an automatic translator, but translating from Arabic or Chinese to English is significantly harder. And with sign language, the task becomes more complicated still.
There are initiatives aimed at recognizing sign language, but they usually require high-powered equipment. Google's AI lab, however, has created a system capable of generating an accurate map of the hand and its fingers using only a smartphone.
The system relies on machine learning to capture hand movements. A smartphone's camera and its on-board processing power are enough to detect the gestures of several hands at once. This is no easy task: as the hands move, fingers occlude one another, reposition, and do all of it very quickly.
A Thorough Training
Just as people must study to learn a language, the system developed at Google to translate sign language needs training. To streamline its operation, it first detects the palm; from there, the fingers are analyzed separately.
A second algorithm then examines the image and assigns 21 coordinates to the hand. These points map the positions of the knuckles and fingertips and the distances between them.
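A minimal sketch, assuming a simple normalized-coordinate scheme, of what a 21-landmark hand representation like the one described above might look like. The class and field names here are illustrative assumptions, not Google's actual code.

```python
# Sketch of a 21-landmark hand representation: one (x, y) point for the
# wrist plus four points per finger (knuckles and fingertip) for each of
# the five fingers. All names are hypothetical, for illustration only.
from dataclasses import dataclass
from typing import List, Tuple

# 1 wrist point + 4 points per finger x 5 fingers = 21 landmarks
NUM_LANDMARKS = 21

@dataclass
class HandLandmarks:
    """21 (x, y) points, normalized to [0, 1] image coordinates."""
    points: List[Tuple[float, float]]

    def __post_init__(self):
        # Guard against partial detections: a valid hand has all 21 points.
        if len(self.points) != NUM_LANDMARKS:
            raise ValueError(f"expected {NUM_LANDMARKS} points, got {len(self.points)}")

# Toy example: 21 evenly spaced points standing in for a detected hand
open_hand = HandLandmarks([(i / NUM_LANDMARKS, 0.5) for i in range(NUM_LANDMARKS)])
print(len(open_hand.points))  # 21
```

Keeping the landmarks as normalized coordinates, rather than raw pixels, makes the representation independent of the camera resolution, which matters when the same model has to run on many different phones.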
Getting there required thorough training. The researchers manually annotated these 21 coordinates on 30,000 images of hands, showing different signs under different lighting conditions. This carefully built database is what feeds the machine learning system.
All of this is used to determine the pose of the hand, which is then compared against a database of known gestures. The results are promising. The algorithms still need polishing, but they point to real improvements over existing systems for translating sign language. Most interesting of all is how few resources they require: just a smartphone and its camera.
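The final step described above, matching detected landmarks against a database of known gestures, can be sketched as a nearest-neighbor lookup. The gesture names, templates, and distance metric below are illustrative assumptions, not Google's implementation.

```python
# Hedged sketch: classify a detected hand by finding the gesture template
# whose 21 landmarks are closest to the detection. Templates and metric
# are toy assumptions for illustration.
import math
from typing import Dict, List, Tuple

Landmarks = List[Tuple[float, float]]  # 21 (x, y) points per hand

def mean_distance(a: Landmarks, b: Landmarks) -> float:
    """Mean Euclidean distance between corresponding landmark points."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def classify(detected: Landmarks, database: Dict[str, Landmarks]) -> str:
    """Return the name of the known gesture closest to the detection."""
    return min(database, key=lambda name: mean_distance(detected, database[name]))

# Toy database: two made-up gesture templates, each 21 points
templates = {
    "fist": [(0.5, 0.5)] * 21,                      # fingers bunched together
    "open_hand": [(i / 20, 0.5) for i in range(21)],  # fingers spread out
}

# A detection with slight noise, close to the "fist" template
detection = [(0.51, 0.5)] * 21
print(classify(detection, templates))  # fist
```

A real system would compare many frames over time rather than single poses, since signs involve motion, but the core idea of matching landmark configurations against known references is the same.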