I have always admired individuals who are fluent in American Sign Language (or other sign languages). With their hands alone, they can carry on full conversations and convey feelings, messages, and stories. This kind of skill takes a lot of practice, and signers must continue learning new signs to stay current. Keeping up with the growing vocabulary is often difficult, due to the small number of quality resources and video tools available.
Many signs are similar to one another, and a sign's meaning is determined by the hand shape, arm movements, and even facial expressions. A new ASL video tool is being developed at Boston University, supported by a three-year, $900,000 grant and a team of enthusiastic interpreters.
The process begins with words being spoken to an interpreter while four cameras record the sign and its movements for each word. One camera captures a side angle, two record close-ups from the front, and the fourth provides a wide front view.
The new sign language tool is designed to convert a sign into its meaning. In practice, this means that if someone does not understand a certain sign, they can replicate it in front of a camera to look up what it means.
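To give a sense of how a sign-to-meaning lookup could work, here is a minimal, purely illustrative sketch: a recorded sign is reduced to a feature vector (summarizing hand shape and movement), and the closest entry in a small sign database is returned. The feature representation, the `SIGN_DB` entries, and the nearest-neighbor approach are all assumptions for illustration; the article does not describe the actual matching method used by the Boston University tool.

```python
import math

# Hypothetical sign database: each name maps to a feature vector
# summarizing hand shape and movement (values are made up).
SIGN_DB = {
    "hello":     [0.9, 0.1, 0.3],
    "thank-you": [0.2, 0.8, 0.5],
    "please":    [0.4, 0.4, 0.9],
}

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def lookup_sign(query):
    """Return the database sign whose features are closest to the query."""
    return min(SIGN_DB, key=lambda name: euclidean(SIGN_DB[name], query))

# A recorded sign whose features are close to "hello"
print(lookup_sign([0.85, 0.15, 0.25]))  # -> hello
```

A real system would extract features from video across multiple camera angles and handle facial expressions as well, but the core idea of matching an unknown sign against a labeled database is the same.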
Since the deaf community is sizable in the U.S. and worldwide, such a tool could assist many people who interact with deaf individuals, including parents, students, interpreters, spouses, and many more.
For more information, see the Boston University ASL Linguistic Research Project.
Photos from Boston University and Christian Vogler