University of Washington engineers have developed a cell phone system that lets deaf people use sign language when making phone calls. The problem with directly streaming video is that today's networks often aren't fast enough to deliver high resolution at 30 frames per second, and full-resolution video also carries high bandwidth costs and drains the battery. To overcome this, algorithms on the phone identify hand motions and prioritize transmitting those regions at the expense of the rest of the image. Eleven phones are currently being trialed by students at UW's summer program for deaf and hard-of-hearing students.
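The article doesn't detail MobileASL's actual encoder, but the idea of spending bits where the hands are moving can be sketched as region-of-interest encoding: compare consecutive frames, and flag the macroblocks with the most motion for higher quality. Everything below (function names, block size, thresholds) is an illustrative assumption, not the UW implementation:

```python
import numpy as np

BLOCK = 16  # macroblock size, as in H.264-style encoders (assumption)

def roi_quality_map(prev_frame, curr_frame, threshold=10.0):
    """Assign higher quality to blocks with large frame-to-frame motion.

    Returns a per-block quality weight: 1.0 for high-motion blocks
    (likely the signer's hands), 0.25 for static background. An encoder
    could map these weights to lower vs. higher quantization.
    """
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    h, w = diff.shape
    bh, bw = h // BLOCK, w // BLOCK
    # Group pixels into BLOCK x BLOCK tiles and average the motion in each
    blocks = diff[: bh * BLOCK, : bw * BLOCK].reshape(bh, BLOCK, bw, BLOCK)
    motion = blocks.mean(axis=(1, 3))
    return np.where(motion > threshold, 1.0, 0.25)

# Example: a static grayscale frame with simulated hand motion in one corner
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[0:16, 0:16] = 200  # the "moving hand" region
qmap = roi_quality_map(prev, curr)  # 4x4 map; only block (0, 0) gets weight 1.0
```

A real system would add skin-tone or hand-tracking cues rather than rely on raw frame differencing, but the bandwidth saving comes from the same principle: most of the frame is static background that can be coded coarsely.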
The UW team estimates that Apple's FaceTime video-conferencing service uses nearly 10 times the bandwidth of MobileASL. Even after the anticipated release of an iPhone app to transmit sign language, users would need to own an iPhone 4 and be in an area with very fast network speeds to use the service. The MobileASL system could be integrated with the iPhone 4, the HTC Evo, or any device that has a video camera on the same side as the screen.
More from University of Washington: Deaf, hard-of-hearing students do first test of sign language by cell phone…